27
Science and Technology

Kristine Krapp
Genevieve Slomski

Early African American Inventors

Early African American Scientists

African Americans in Medicine

African Americans in Air and Space

Modern Contributions to Science and Technology

Engineers, Mathematicians, Inventors, Physicians, and Scientists

EARLY AFRICAN AMERICAN INVENTORS

Perhaps in science more than in other areas, African Americans have been afforded few sanctioned opportunities to offer contributions. However, will and intelligence helped individuals bring their ideas and dreams into the light. The Industrial Revolution swept African Americans along just as dramatically as it did the rest of the world. Though not all of them became household names, African Americans have made their mark in science and technology. For example, when Alexander Graham Bell invented the telephone, he chose Lewis Latimer to draft the plans. Later, Latimer worked for Thomas Edison's companies from 1884 to 1912 and became a member of the Edison Pioneers, a group of Edison's early associates.

One of the earliest African American stars of science was Benjamin Banneker, a free African American who lived in the 1700s. Considered the first African American scientist, Banneker was an expert in mathematics and astronomy, both of which he studied during his friendship with an influential white Quaker neighbor. In 1754, Banneker constructed what has been considered the first grandfather clock made in the United States. Later, Banneker was selected to assist surveyor Andrew Ellicott, a member of the Quaker's family, in laying out the plans for the city of Washington, D.C. Thus, not only was Banneker the first African American to receive a presidential appointment, he was one of the first African American civil engineers. In the early 1790s, his almanac—a year-long calendar loaded with weather and astronomical information that was especially useful to farmers—was published with much success. New editions were issued for several years.

In 1790, the U.S. government passed the U.S. Patent Act, legislation that extended patent rights to inventors, including free blacks. Slaves would not have this right until the passage of the Fourteenth Amendment. In one of history's most absurd bureaucratic fiats, slaves could neither be granted patents nor assign patents to their masters. The underlying theory was that since slaves were not citizens, they could not enter into contracts with their owners or the government. As a result, the efforts of slaves were dismissed or credited to their masters. One can only speculate on the extent to which slaves were active in invention. For example, Joe Anderson, a slave, was believed to have played a major role in the creation of a grain harvester, or reaper, that his master Cyrus McCormick was credited with inventing, but available records are insufficient to determine the degree to which Anderson was involved. Similarly, Benjamin Montgomery, a slave held by the family of Confederate President Jefferson Davis, is thought to have devised an improved boat propeller. Since the race of patent-seekers was rarely noted, and other African American inventions, such as ice cream, created by Augustus Jackson of Philadelphia in 1832, were simply never patented, one cannot be sure how many inventions were made by free blacks either.

The first free blacks to have their inventions recorded were Thomas L. Jennings, whose dry-cleaning methodology received patent protection in 1821, and Henry Blair, who invented a seed planter in 1834. Free black Norbert Rillieux patented his sugar-refining evaporator in the 1840s, revolutionizing the industry. The son of a French planter and a slave woman, Rillieux left his home in New Orleans to study engineering in Paris. After teaching mathematics there and experimenting with steam evaporation, he created his vacuum pan evaporator. With his invention, a single person could do work that had once required several people. He returned to the United States and became wealthy as the device was implemented in sugar refineries in his home state and abroad in Cuba and Mexico. However, racial tensions in the United States wore on him, and in 1854, he moved to France, where he spent the remainder of his life.

In 1848, free black Lewis Temple invented the toggle harpoon for use in whaling, a major industry at the time. Temple's invention almost completely replaced the type of harpoon formerly used, as it greatly diminished a whale's ability to escape after being struck. Prior to the Civil War, Henry Boyd created an improved bed frame, and James Forten, one of the few African Americans from that era to gain extreme wealth from an invention, produced a device that helped guide ship sails. He used the money he earned to expand his sail factory.

The Reconstruction era opened the door to creativity that had been suppressed in African Americans. Between 1870 and 1900, a time when nearly 80% of African American adults in the United States were illiterate, African Americans were awarded several hundred patents. Elijah McCoy worked as a locomotive fireman on a Michigan line, lubricating the hot steam engines during scheduled train stops. After years of work, in 1872, McCoy perfected and patented an automatic lubricator that regularly supplied oil to the engine as the train was in motion. The effect on the increasingly important railway system was profound, as conductors were no longer forced to make oiling stops. McCoy adapted his invention for use on ships and in factories. When copycats tried to steal his invention, the phrase "the real McCoy" came into vogue.

In 1884, Granville T. Woods invented an improved steam boiler furnace in his Cincinnati electrical engineering shop. Three years later, Woods patented an induction telegraph, or "synchronous multiplex railway telegraph," that allowed train personnel to communicate with workers on other trains while in motion. He was also responsible for what later became known as the trolley when he produced an overhead electrical power supply system for streetcars and trains. A prolific inventor, Woods, known as "The Black Edison," patented more than 60 valuable inventions, including an air brake, which he eventually sold to George Westinghouse.

Jan Matzeliger came to the United States from South America in 1877. Living in Lynn, Massachusetts, he obtained work in a shoe factory. There he witnessed the tedious process by which shoe soles were attached to shoe uppers by workers known as hand lasters. For six months he secretly labored at inventing a machine to automate the work. Unsatisfied with his original design, he spent several more years tweaking and perfecting his creation; after he was granted a patent in 1883, the equipment proved so successful that manufacturers the world over clamored for it.

Progress has been a gift from women as well as men: Sarah Goode is credited with creating a folding cabinet bed in 1885; Sarah Boone patented an improved ironing board in 1892; and photographer Claytonia Dorticus was granted several patents concerned with photographic equipment and developing solutions, as well as a shoe dye. But Madame C. J. Walker, often regarded only as an entrepreneur, was one of the most successful female inventors. She developed an entire line of hair care products and cosmetics for African Americans, claiming that her first idea had come to her in a dream.

During the next few years, Garrett Morgan patented a succession of products, including a hair straightening solution that was still a bestseller as late as the 1970s; a gas mask, or breathing device, for firefighters; and an improved traffic signal. Morgan tried to pass himself off as Native American. However, once his identity as an African American was discovered, several of his purchase orders were canceled.

Nonetheless, the early inventors paved the way for future African Americans. These men and women, as well as the countless unknown ones, were forced to endure the byproducts of racism. Whites were oftentimes hesitant to buy African American inventions unless the prospect of monetary gain was too strong to ignore. McCoy, Woods, and several others died poor, although their creations sold extremely well.

EARLY AFRICAN AMERICAN SCIENTISTS

The contributions of African American scientists are better known than those of African American inventors, partly because of the recognition awarded to George Washington Carver, an agriculturalist who refused to patent most of his inventions. Born into slavery in 1864, Carver was the first African American to graduate from Iowa State Agricultural College, where he studied botany and agriculture. One year after earning a master's degree, Carver joined Tuskegee Institute's Agriculture Department. In his role as department head, he engineered a number of experimental farming techniques that had practical applications for farmers in the area. Through his ideas, from crop rotation to the replenishment of nutrient-starved soil to his advocacy of peanuts as a cash crop, Carver left an indelible mark on his field. An inventor at heart, he was behind the genesis of innumerable botanical products, byproducts, and even recipes. Recognition of his efforts came in several forms, including induction into England's Royal Society of Arts in 1916. In 1923, he received the NAACP's Spingarn Medal, and in 1948, five years after his death, Carver was pictured on a U.S. postage stamp.

Born approximately 10 years before Carver earned his bachelor's degree, Ernest Everett Just was a pioneering marine biologist who graduated magna cum laude from Dartmouth College in 1907. In 1915, he became the first-ever recipient of the Spingarn Medal. His first paper, "The Relation of the First Cleavage Plane to the Entrance Point of the Sperm" (1912), showed how the location of cell division in the marine worm Nereis is determined by the sperm's entry point on the egg. Just did the majority of his research at the Marine Biological Laboratory in Woods Hole, Massachusetts, where he spent many summers. Although he taught at Howard University for many years, his relationship with the school was often strained, paving the way for him to accept an offer to conduct research at the Kaiser Wilhelm Institute for Biology in Berlin, Germany. The first American invited to the internationally respected institution, he remained there from 1929 to 1933, by which point the Nazi regime was surging to power. Because he preferred working abroad to being shut out of the best laboratories in the United States on the basis of race, Just spent the rest of his career in France, Italy, Spain, and Portugal.

African Americans have had successes in engineering and mathematics as well. In 1876, Edward Bouchet became the first African American to earn a doctorate from a university in the United States when he acquired a Ph.D. in physics from Yale. In the twentieth century, Elmer Samuel Imes, husband of Harlem Renaissance writer Nella Larsen, received a Ph.D. in physics from the University of Michigan in 1918. In his dissertation, Imes took the works of white scientists Albert Einstein, Ernest Rutherford, and Niels Bohr one step further, definitively establishing that quantum theory applied to the rotational states of molecules. His efforts would later play a role in space science.

Chemist Percy Julian carved a brilliant career for himself after obtaining a doctorate from Austria's University of Vienna in 1931. His specialty was creating synthetic versions of expensive drugs. Much of his work was conducted at his Julian Research Institute in Franklin Park, Illinois. In the 1940s, another scientist, Benjamin Peery, switched his focus from aeronautical engineering to physics while still an undergraduate at the University of Minnesota. After garnering a Ph.D. from the University of Michigan, Peery went on to a lengthy career teaching astronomy at Indiana University, the University of Illinois, and Howard University.

Between 1875 and 1943, only eight African Americans were awarded doctorates in pure mathematics. David Blackwell became the first tenured African American professor at the University of California at Berkeley in 1955. An expert in statistics and probability, he was a trailblazer despite a racially motivated setback he incurred soon after completing his doctoral work at the University of Illinois. Awarded a Rosenwald Fellowship at the Institute for Advanced Study in Princeton, Blackwell faced objections from Princeton University's president because of his race. Undaunted, he went on to become the first African American mathematician elected to the National Academy of Sciences.

AFRICAN AMERICANS IN MEDICINE

The medical profession has yielded a number of African Americans of high stature. As early as the 1860s, African Americans had entered medical schools in the North and had gone on to practice as full-fledged physicians. In fact, during the Civil War, Dr. Alexander T. Augusta was named head of a Union army hospital, and Rebecca Lee Crumpler became the first female African American doctor by graduating from the New England Female Medical College in Boston. She was able to attend on a scholarship she received from Ohio Sen. Benjamin Wade, an abolitionist. She used her schooling to provide health care to former slaves in the former Confederate capital of Richmond, Virginia. Her 1883 Book of Medical Discourses taught women how to address their own health issues, as well as those of their children.

Rebecca J. Cole was the second African American woman to become a physician and the first African American graduate of the Women’s Medical College of Pennsylvania. For over 50 years, she devoted her life to improving the lot of the poor. Her positions included performing a residency at the New York Infirmary for Women and Children and running Washington, D.C.’s Government House for Children and Old Women and Philadelphia’s Woman’s Directory, a medical aid center.

In 1867, Susan McKinney Steward began studying at the New York Medical College for Women. Three years later, she earned the distinction of being the third African American female physician in the United States and the first in New York State. She specialized in homeopathic treatments and treated black and white patients of both sexes. After opening a second office in New York City, she helped co-found the Brooklyn Women's Hospital and Dispensary. She also served at the Brooklyn Home for Aged Colored People. Steward vigorously supported the women's suffrage movement and conducted missionary work with her second husband, a chaplain for a Buffalo Soldier regiment. She ended her career as school doctor at Wilberforce University.

In 1868, Howard University opened its College of Medicine, the first African American medical school in the country. The school nearly failed five years later when monetary problems arose and faculty salaries went unpaid. Thanks to the efforts of Dr. Charles Purvis, who convinced the school to let him and his peers continue teaching on an unpaid basis, the school survived the crisis. Purvis was later appointed chief surgeon of Washington, D.C.'s Freedmen's Hospital, making him the first African American to run a civilian hospital. He held the post until 1894, when he began a private practice.

Meanwhile, in 1876, Nashville's Meharry Medical College was founded. Jobs for African American physicians remained decidedly scarce, since they were routinely turned away from nearly every facility other than Freedmen's Hospital, but the school was another sign of the slowly developing progress of African American physicians, among them Dr. Daniel Hale Williams, who replaced Purvis at Freedmen's. Williams advanced Freedmen's through internships, better nurses' training, and the addition of horse-drawn ambulances.

Williams had graduated from the Chicago Medical College in 1883 and entered private practice almost immediately. Business was slow until 1890, when he met Emma Reynolds, an aspiring African American nurse whose skin color had kept her from gaining admission to any of the nursing schools in Chicago. Inspired by her dilemma, Williams decided to operate his own hospital in hopes of initiating his own program for aspiring nurses. With 12 beds, Provident Hospital became the first African American-operated hospital in the United States, and Reynolds was the first to enroll in Williams's classes. Near the end of his career, Williams was appointed the first African American associate surgeon at Chicago's St. Luke's Hospital and later was the only African American charter member of the American College of Surgeons. During his career, Williams helped convince 40 hospitals to treat African American patients.

African Americans in the South also received improved care in the late 1890s, thanks to Alice Woodby McKane and her spouse, who was also a doctor. In 1893, the couple founded the first training school for African American nurses in Savannah, Georgia. McKane had obtained her medical degree one year earlier from the Women's Medical College of Pennsylvania. In 1895, the couple set up their first hospital in Monrovia, Liberia, before establishing the McKane Hospital for Women and Children in Georgia the following year.

Progress moved westward as another African American woman used her training to benefit the region’s African American population, though her patients transcended all racial barriers. Beginning in 1902, Denver’s “Baby Doctor,” Justina Ford, proudly served her community as the only African American physician in Colorado for over 50 years. An obstetrician, she delivered more than 7,000 babies, conducting most of her business by house calls.

Back in the East, Freedmen's Hospital was the training ground for a future authority on head trauma, Dr. Louis Wright, a Harvard Medical School graduate whose high academic standing meant nothing to Boston-area hospitals that refused to hire African Americans. When World War I erupted, Wright enlisted and found himself in charge of his unit's surgical ward. After the war, Wright, who had received a Purple Heart, became the first African American physician to work in a New York City hospital when he was appointed to Harlem Hospital in 1919. Later he became director of surgery and president of the medical board, and he was admitted to the American College of Surgeons. In 1948, four years before his death in 1952, he founded the Cancer Research Foundation at Harlem Hospital. The son of two physicians, his father and his stepfather, the latter of whom was the first African American graduate of Yale Medical School, Wright had two daughters who continued the family legacy by becoming doctors.

An almost legendary legacy was created by Dr. Charles R. Drew, a star high school athlete whose interest lay in medicine. A pathologist and expert on blood transfusions, Drew discovered that blood plasma was easier to store than whole blood and was compatible with all blood types. His research earned him a doctorate in medical science from Columbia University in 1940. During World War II, he helped Great Britain develop a national blood collection program and was later asked to do the same for the U.S. armed forces. Unfortunately, racism reared its ugly head again—African American donors were at first completely excluded from the program and later were allowed to donate only to other African American servicemen. Frustrated, Drew withdrew from the program, briefly resuming his teaching career at Howard before joining the staff of Freedmen's Hospital as medical director.

Howard continued developing new talents. Dr. Roland Scott, a physician at Howard University's College of Medicine, became a pioneer in the study and treatment of sickle cell anemia. His research was pivotal in drawing public attention to the disorder and prompting the U.S. government to devote money to more extensive study. Under the Sickle Cell Anemia Control Act passed in 1972, Congress directed the National Institutes of Health to set up treatment centers for patients. Scott was named director of the program, which involved screening as well as treatment for those already afflicted.

In recent years, African Americans have made great strides in the fields of medicine. Alexa Canady, who earned her medical degree in 1975, became the first African American woman neurosurgeon in the United States. She continues to be a leader and innovator in the areas of craniofacial abnormalities, epilepsy, hydrocephalus, pediatric neurosurgery, and tumors of the spinal cord and brain. Her research contributions include helping to develop neuroendoscopic equipment, evaluating programmable pressure-change valves for hydrocephalus, and studying head injury, shunts, and the complications shunts can pose during pregnancy. Another African American neurosurgeon who advanced the field to new heights was Dr. Ben Carson. In 1987, Carson led a 70-member surgical team at Johns Hopkins Hospital to separate conjoined twins who were joined at the cranium. No such separation had ever succeeded with both infants surviving, but Carson's team not only completed the operation but saved both twins with minimal brain damage.

Canady and Carson changed the way the medical profession handled injuries and complications of the brain. Ten years after Carson's groundbreaking surgery, a pair of doctors changed the way the medical community dealt with the subject of multiple births. Drs. Paula Mahone and Karen Drake headed the team of 40 specialists involved in the delivery of the McCaughey septuplets at Iowa Methodist Medical Center, the first set of septuplets to be born and survive in the United States. Mahone, Drake, Canady, and Carson are but a few examples of the major new discoveries and achievements that African Americans have made in the medical field.

African Americans have also begun to break into positions of power and prestige in the medical community. In 1989, Renee Rosilind Jenkins became the first African American president of the Society for Adolescent Medicine. In 1990, Roselyn Epps became the first African American president of the American Medical Women's Association, and a year later she became the first woman president of the DC Medical Society. In 1993, Dr. Barbara Ross-Lee was named dean of the Ohio University College of Osteopathic Medicine, making her the first African American woman to lead a U.S. medical school.

Such advancement was also seen at the national level. In 1993, Dr. Joycelyn Elders became the first African American to be appointed U.S. surgeon general. Five years later, in 1998, Dr. David Satcher was sworn in as both U.S. surgeon general and assistant secretary of the Department of Health and Human Services.

AFRICAN AMERICANS IN AIR AND SPACE

In 1920, Texan Bessie Coleman was accepted at the French flying school École d'Aviation des Frères Caudron, following a string of rejections from aviation schools in the United States. Having completed seven months of instruction and a rigorous qualifying exam, she earned her international aviator's license from the Fédération Aéronautique Internationale the following year and went on to study further with aircraft designer Anthony H. G. Fokker. Known to an admiring public as "Queen Bess," Bessie Coleman was the first African American woman ever to fly an airplane, the first American to earn an international pilot's license, and the first African American female stunt pilot. During her brief yet distinguished career as a performance flier, she appeared at air shows and exhibitions across the country, earning wide recognition for her aerial skill, dramatic flair, and tenacity. The professional aviatrix met her tragic end in 1926, when she was scheduled to parachute jump from a speeding plane at 2,500 feet. Ten minutes after takeoff, however, the plane careened wildly out of control, flipping over and dropping Coleman, who plunged 500 feet to her death. The pilot, though he remained in the aircraft, was killed instantly when the plane crashed to the ground. A service wrench mistakenly left in the engine was later found to have caused the accident.

Six years later, in 1932, pilot James Herman Banning and mechanic Thomas C. Allen flew from Los Angeles to New York City in 41 hours and 27 minutes. The transcontinental flight was followed by the first round-trip transcontinental flight the next year. That feat was accomplished by Albert Ernest Forsythe and Charles Alfred Anderson, who flew from Atlantic City to Los Angeles and back in 11 days, foreshadowing the advent of commercial flight.

Willa B. Brown became the first African American woman to hold a commercial pilot's license in the United States, earning it in 1939. She was also the first African American woman to rise to the rank of lieutenant in the Civil Air Patrol. Brown later founded the National Airmen's Association of America, the first aviators' group for African Americans. With her husband, Cornelius R. Coffey, she established the first African American-owned flying school—the Coffey School of Aeronautics—and the first African American-owned school to receive certification from the Civil Aeronautics Authority. Brown became the first African American member of the Federal Aviation Administration's Women's Advisory Committee in 1972.

The second African American woman to earn a full commercial pilot's license was Janet Harmon Bragg, a Georgia-born nurse who took an interest in flying when she began dating Johnny Robinson, one of the first African American aviation instructors. The first woman of any race to be admitted to Chicago's Curtiss-Wright Aeronautical University, she was initially denied her commercial license despite having successfully fulfilled all preliminary requirements, including the airborne portion of the test. Her white examiner made it quite clear, however, that he would not grant a license to an African American woman. Rather than give up, Bragg merely tested again with another examiner the same year and was granted her license in 1942. Along with a small group of African American aviation devotees, she formed the Challenger Air Pilots Association. Together, members of the association opened an airport in Robbins, Illinois, the first owned and operated by African Americans.

Other African American notables in the field of aviation include: Perry H. Young, who in 1957 became the first African American pilot for a scheduled passenger commercial airline, New York Airways; Otis B. Young Jr., who in 1970 was the first African American pilot of a jumbo jet; and former naval pilot Jill Brown, who in 1978 became the first African American woman to pilot for a major airline.

Military men were the first African Americans to enter the line of space exploration. In 1961, U.S. Air Force Captain Edward Dwight was invited by President John F. Kennedy to apply to test-pilot school. Two years later, Dwight was in the midst of spaceflight training when Kennedy was assassinated. Without the president's support, Dwight was largely passed over by the National Aeronautics and Space Administration (NASA). Air Force Major Robert H. Lawrence thus became the first African American astronaut a few years later, when he was selected for the Air Force's Manned Orbiting Laboratory program. A doctor of physical chemistry, Lawrence was killed in a plane crash in December 1967, just six months after his selection. African Americans would not make inroads into space until the genesis of the Space Shuttle program.

African American scientists were, however, prevalent. For example, Katherine Johnson joined the National Advisory Committee for Aeronautics, the precursor to NASA, in 1953. Initially, all she was asked to do was basic number crunching, but she spent a short period filling in at the Flight Research Division. There her valued interpretation of data helped in the making of prototype spacecraft, and she soon developed into an aerospace technologist. She developed trajectories for the Apollo moon-landing project and devised emergency navigational methods for astronauts. She retired in 1986.

Emergencies of another sort have been tackled by air force flight surgeon Vance Marchbanks, whose research showed that adrenaline levels could affect the exhaustion level of flight crews. His work brought him to the attention of NASA, and he became a medical observer for NASA’s Project Mercury. Along with several other personnel scattered about the globe, Marchbanks, stationed in Nigeria, was responsible for monitoring pioneering astronaut John Glenn’s vital signs as he orbited the earth in 1962. Later, Marchbanks received the civilian post of chief of environmental health services for United Aircraft Corporation, where he had a hand in designing the space suit and medical monitoring systems used in the Apollo moon shot.

Also specializing in design, aeronautical test engineer Robert E. Shurney spent nearly his entire career, from 1968 to 1990, at the Marshall Space Flight Center. His products included refuse disposal units that stored solids in the bottom and liquids in tubes to prevent any materials from floating freely and contaminating an entire cabin. The units were used in the Apollo program, on Skylab, and on the first space shuttle missions. He also crafted strong yet lightweight aluminum tires for the lunar rover. Much of his experimentation was conducted aboard KC-135 test planes in order to achieve the condition of weightlessness.

Assertiveness enabled O. S. Williams to bring forth his own achievements. In 1942, Williams talked his way into employment at Republic Aviation Corporation as part of the technical staff. Better known as Ozzie, he took the experience he gained there to NASA contractor Grumman Corporation. The small rocket engines that he co-developed helped save the lives of the Apollo 13 astronauts when an oxygen tank aboard the spacecraft exploded during the 1970 flight.

Three missions later, George Carruthers, a Naval Research Laboratory astrophysicist, designed the far-ultraviolet camera/spectrograph used on Apollo 16. The semiautomatic device, once set up on the surface of the moon, was able to photograph deep space—regions too distant to be captured by regular cameras. Carruthers earned a Ph.D. in aeronautical and astronautical engineering from the University of Illinois in 1964 and was granted his first patent in 1969 for an electromagnetic radiation image converter.

With a 1967 Ph.D. in atomic and molecular physics from Howard University, Carruthers's contemporary George E. Alcorn has been one of the most prominent people working with semiconductors and spectrometers. Working for IBM and later NASA, Alcorn has over 25 patents to his name, including secret projects concerning missile systems.

In a less clandestine fashion, aerospace engineer Christine Darden has been a leading NASA researcher in supersonic and hypersonic aircraft. Her main goal has been the reduction of sonic boom, a phenomenon that creates an explosive burst of sound that can traumatize those on the ground. Darden works at manipulating the shape of an aircraft's wing or nose to control the shock waves produced by the plane's flight.

Dealing with people rather than machinery, Patricia Cowings, director of psychophysiology at NASA's Ames Research Center, has done postdoctoral work touching on such fields as aerospace medicine and bioastronautics. Since the late 1970s, she has assisted astronauts by teaching them biofeedback techniques—how to impose mind over matter when zero gravity wreaks havoc with one's system. By studying the physical and emotional problems that arise in such a setting, she can seek out the cause and prescribe a therapy to alleviate stress. She was also the first woman of any race in the United States to receive astronaut training.

These individuals are joined by numerous others in the field of aviation and space flight including chemical engineer Henry Allen Jr., a liquid and solid rocket fuel specialist; missile expert and inventor Otis Boykin; health services officer Julian Earls; aerospace technologist Isabella J. Coles; astrodynamicist Robert A. Gordon; and operations officer Isaac Gillam IV, to name a few. Once the Space Shuttle program began in earnest, however, African Americans also took to the skies.

In the field of astrophysics, Neil deGrasse Tyson, author of six books on astronomy and named one of the 50 Most Important Blacks in Research Science in 2004, is the Frederick P. Rose Director of the Hayden Planetarium at the American Museum of Natural History in New York City—the youngest director in the planetarium's history. His research interests include star formation, exploding stars, dwarf galaxies, and the structure of the Milky Way.

Traveling in the Space Shuttle Challenger, U.S. Air Force Colonel Guion "Guy" Bluford was the first African American to fly in space, where he coordinated experiments and was in charge of deploying satellites. After his first mission in 1983, Bluford participated in three more. Astronaut Ronald McNair was aboard the tragic Challenger flight of 1986, his second trip on the shuttle; the vehicle exploded 73 seconds after liftoff, killing all seven crew members. Charles Bolden's first mission was aboard the 1986 flight of the Space Shuttle Columbia, and he has also flown on the Discovery. The first African American to pilot a space shuttle was Frederick Drew Gregory, who did so in 1985, on his first journey to outer space. A veteran pilot of both helicopters and airplanes, Gregory became an astronaut in 1979 and went on to command two later shuttle missions. Mae Jemison went into space as a science specialist on 1992's joint U.S.-Japanese mission aboard the shuttle Endeavour. The following year, Bernard Harris took off in the Space Shuttle Columbia, serving as a mission specialist on Spacelab D-2 alongside German and American crewmates.

MODERN CONTRIBUTIONS TO SCIENCE AND TECHNOLOGY

The achievements of African American inventors and scientists of the mid- to late twentieth century have been obscured for reasons more complex than blatant racial prejudice, chief among them the advent of government and corporate research and development teams. Such work, whether contracted or direct, often precludes individual recognition, regardless of a person's race. Nonetheless, in the corporate world as well as in academia, African American scientists and engineers play a substantial role in the development of solid-state devices, high-powered and ultra-fast lasers, hypersonic flight, and elementary particle science. African American engineers employed by NASA in managerial and research positions have made, and continue to make, considerable contributions.

African American manufacturing and servicing firms in various computer and engineering areas have been established. For example, African American entrepreneur Marc Hannah has made a niche for himself in the field of computer graphics as cofounder of Silicon Graphics Incorporated. Chief scientist and vice president of the company, Hannah has adapted his electrical engineering and computer science know-how to a variety of applications including military flight simulators and CAT scan devices. In addition, his computer-generated, 3-D special effects have been featured in such major films as Terminator 2 (1991), Aladdin (1992), and Jurassic Park (1993).

Academia has more African American science and technology faculty members, college presidents, and engineering school deans than in the past, and many of these academics serve at the country's most prestigious institutions. However, this progress has not been steady, and there is cause for concern about the future. The 1970s was a decade of tremendous growth for minorities in science and engineering. In the 1980s, though, there was a progressive decline in the production of African American scientists, even as the numbers of Asian American and women scientists continued to grow. In 1977, for example, people of color earned 13% of science and engineering doctorates, with Asian Americans accounting for 3% of those degrees. By 1993, 16% of the degrees went to people of color, and Asian Americans earned 7% of those degrees. In addition, women earned 40% of science and engineering doctorates in 1993, up from 25% in 1977. The number of African Americans entering scientific fields has slowly increased since the late 1980s, although they continue to be grossly underrepresented. Another area in which African Americans have been faltering is medicine.

In the mid- to late 1990s, the number of African American applicants to medical school was declining at a high rate, and the search for potential African American physicians neared crisis level. Among the repercussions of this shortage is the difficulty the poor and elderly face in finding African American physicians if they so desire. Primary care specialists—internists, pediatricians, obstetricians, gynecologists, and the like—were particularly in demand.

The health care profession began responding to this problem in 1991, when the Association of American Medical Colleges initiated Project 3000 by 2000, whose primary aim was to enroll 3,000 minority students in U.S. medical schools each year by the year 2000. Xavier University, in particular, was the top school in the country for African American placement into medical school, gaining a reputation for placing an average of 70% of its premed seniors into medical schools each year. Meanwhile, African American doctors already in practice were forming cooperatives among themselves in order to serve African American patients who were discriminated against by health maintenance organizations (HMOs) that considered them too poor or too sick to be participants.

The situation is not as dire in engineering, perhaps due in part to a mentoring program established in 1975 by the National Action Council for Minorities in Engineering (NACME). With industry backing, the council has focused on youngsters as early as the fourth grade. More than 4,700 of its students have acquired engineering degrees, and its graduates make up 10% of all engineers from minority groups. However, there is some indication that fewer African Americans have entered engineering fields since 1980. As of 1996, about 29% of the college-age population was made up of African Americans, Latinos, and Native Americans, yet this same group accounted for less than 3% of engineering doctoral recipients.

Still, the importance of role models with names and faces cannot be overlooked. Some African American scientists have entered into the public consciousness; for example, in 1973, Shirley Ann Jackson became the first African American woman in the United States to earn a Ph.D. in theoretical particle physics as well as the first female African American to earn a Ph.D. from the prestigious Massachusetts Institute of Technology (MIT). She has had a distinguished career, culminating with her appointment as chair of the Nuclear Regulatory Commission by President Bill Clinton in 1995.

Another African American rose to the position of National Science Foundation (NSF) director, the highest science-related administrative post in the United States. Holder of a physics Ph.D. from St. Louis’s Washington University, Walter Massey was able to create a number of programs to provide science-oriented training to young African Americans. During his two-year stint at the NSF, from 1991 to 1993, Massey repeated the kind of success he had when he began the Inner City Teachers Science program while teaching at Brown University.

In the field of medical research, Charles Whitten founded the National Association for Sickle Cell Disease in 1971. His work has been complemented more recently by Griffin Rodgers, chief of the Molecular Hematology Unit at the National Institutes of Health. In the 1990s, Rodgers was working on an experimental anti-cancer drug that could possibly provide benefits for sickle cell anemia patients.

Patients with prostate cancer have been encouraged by the work of Detroit-based urologist and oncologist Isaac Powell. In 1995, the Centers for Disease Control and Prevention named his screening program as the outstanding community health project of the year. Powell has been pursuing the idea of advanced diagnostic testing for African American men. Through a partnership with the Karmanos Cancer Institute and area churches, nurses, and hospitals, Powell has been able to educate the public about the importance of undergoing prostate cancer screening. Benefiting from a prostate-specific antigen test, patients have had their cancer caught early enough to undergo successful surgery. In 1996, Powell’s program was being exported to other cities in the United States.

The cancer research of a young African American biologist, Jill Bargonetti, has garnered much attention. She discovered a correlation between a specific gene’s ability to bind with the genetic matter known as DNA and its ability to suppress tumors. In 1996, she received a $300,000, three-year grant from the American Cancer Society and a $200,000, four-year award from the Department of Defense to pursue her study of breast cancer.

Outside of medical research, one-time Olympic athlete and engineering physicist Meredith Gourdine earned a Ph.D. from the California Institute of Technology in 1960. The Olympic medalist then formed Gourdine Systems, a research and development firm geared towards patenting inventions that use state-of-the-art power sources developed from advanced research in physics. Though blinded by diabetes in 1973, Gourdine went on to launch Energy Innovations the next year. An inventor at heart, he has more than 70 patents in his name and was inducted into the Black Inventors Hall of Fame.

The energy of earthquakes motivates geophysicist Waverly Person. His interest in seismology paid off when he was named director of the U.S. Geological Survey’s National Earthquake Information Center in 1977. The first African American earthquake scientist, Person is also the first African American in more than 30 years to hold such a prominent position in the U.S. Department of the Interior.

Similarly, meteorologist Warren Washington has been concerned with the earth’s climate. Since 1987, the greenhouse effect expert has been director of the Climate and Global Dynamics Division of the National Center for Atmospheric Research. After seven years there, he was elected to a one-year term as the first African American president of the American Meteorological Society. Afterwards, Washington co-founded the Black Environmental Science Trust, introducing African American children to science.

In the 1990s and into the new millennium, personal computers and the World Wide Web have been among the largest areas of scientific and industrial invention. One of the fathers of the Internet, Philip Emeagwali created a program in 1989 that used 65,536 separate computer processors to perform 3.1 billion calculations per second, a feat that helped computer scientists comprehend the capabilities of computers communicating with one another. Another innovator is Omar Wasow, the executive director of Blackplanet.com, who created the "community" strategy of running Web domains that almost all Internet sites utilize to maximize efficiency. African Americans have also used the computer to create programs that aid in everyday life and government. One such example is Athan Gibbs, who in 2002 introduced the TruVote voter validation system, a computer touchscreen system that allows voters to touch the picture of the candidate they wish to vote for, reducing confusion among voters and the possibility of corrupted ballots.

Along with hundreds of other notable African Americans, these scientists have been working to strengthen science education at all levels. Their presence, whether inside or outside of the public eye, is felt, and younger African Americans who learn of their endeavors are encouraged to pursue their own scientific creativity.

RAINBOW/PUSH COALITION SEEKS GREATER MINORITY REPRESENTATION IN SILICON VALLEY

Having addressed the lack of employment opportunities for African Americans on Wall Street, the Reverend Jesse Jackson turned the same strategy during 1999 toward Silicon Valley—a segment of the economy where whites hold more than 90% of the chief executive officer jobs and board seats at the top 150 public corporations. Seeking greater African American representation in this prominent high-tech region of California, Jackson's Rainbow/PUSH Coalition purchased approximately $100,000 worth of stock in 50 of the largest high-tech corporations and announced plans to open a staffed office in San Jose, assemble an advisory board of influential Silicon Valley executives to suggest methods of increasing African American and Hispanic American participation in the region's workforce, and host a conference addressing ways to effectively educate minorities for high-tech careers.

While acknowledging that nearly 31% of the high-tech industry's engineers and professionals are Asian American and that Silicon Valley is a major employer of immigrants, Jackson is exerting pressure on corporations to reach beyond their usual networks and work with minority-owned businesses in order to widen their pool of money and talent. With the support of a number of African American chief executive officers—Frank S. Greene of New Vista Capital, Robert E. Knowling of Covad Communications, Roy Clay of Rod-L Electronics, and Kenneth L. Coleman of Silicon Graphics Inc.—Jackson hopes to end the so-called "color-blind" hiring practices that high-tech corporate executives claim to apply, which, in Jackson's opinion, prevent them from recognizing minority markets. (According to Target Market News, African Americans annually spend $3.8 billion on computer and consumer electronic gear.) Jackson acknowledges that since the beginning of his campaign, high-tech companies have begun to embrace diversity in the workplace.

ENGINEERS, MATHEMATICIANS, INVENTORS, PHYSICIANS, AND SCIENTISTS

(To locate biographical profiles more readily, please consult the index at the back of the book.)

GEORGE E. ALCORN (1940– ) Physicist

George Edward Alcorn was born on March 22, 1940. He graduated with a B.A. in physics from Occidental College in 1962 and an M.A. in nuclear physics from Howard University in 1963. In 1967, he earned his Ph.D. from Howard University in atomic and molecular physics. After earning his Ph.D., Alcorn spent 12 years working in industry.

Alcorn left IBM, where he had worked as a second plateau inventor, to join NASA in 1978. While at NASA, Alcorn invented an imaging x-ray spectrometer using thermomigration of aluminum, for which he earned a patent in 1984, and two years later devised an improved method of fabrication using laser drilling. His work on imaging x-ray spectrometers earned him the 1984 NASA/Goddard Space Flight Center (GSFC) Inventor of the Year Award. During this period, he also served as deputy project manager for advanced development and was responsible for developing new technologies required for the space station Freedom. He also managed the GSFC Evolution Program, concerned with ensuring that over its 30-year mission the space station develops properly while incorporating new capabilities.

From 1992 to 2004, Alcorn served as chief of Goddard's Office of Commercial Programs, supervising programs for technology transfer, small business innovation research, and the commercial use of space programs. He managed a shuttle flight experiment involving the Robot Operated Materials Processing System (ROMPS) in 1994. In 2005, he became Assistant Director for Standards/Excellence in Applied Engineering and Technology at Goddard.

In 1999, Alcorn received Government Executive Magazine's Government Technology Leadership Award for developing and commercializing a mapping system called the Airborne Lidar Topographical Mapping System. In 2001, he was awarded special congressional recognition for his efforts in helping Virgin Islands businesses use NASA technology and programs.

Alcorn holds over 25 patents. He is a recognized pioneer in the fabrication of plasma semiconductor devices, and his patent, "Process for Controlling the Slope of a Via Hole," was an important contribution to the process of plasma etching, a procedure now used by many semiconductor manufacturing companies. Alcorn was one of the first scientists to present a computer-modeling solution for wet-etched and plasma-etched structures, and he has received several cash prizes for his inventions of plasma-processing techniques.

ARCHIE ALEXANDER (1888–1958) Civil Engineer

Born in 1888 in Ottumwa, Iowa, Archie Alphonso Alexander graduated from the University of Iowa with a B.S. in civil engineering in 1912. During his collegiate years, he was a star football player who earned the nickname Alexander the Great on the playing field.

His first job was as a design engineer for the Marsh Engineering Company, a company that specialized in building bridges. Two years later, in 1914, Alexander formed his own company, A.A. Alexander, Inc. Most of the firm’s contracts were for bridges and sewer systems. So successful was he that the NAACP awarded him its Spingarn Medal in 1928. The following year, he formed Alexander and Repass with a former classmate. Alexander’s new company was also responsible for building tunnels, railroad trestles, viaducts, and power plants. Some of Alexander’s biggest accomplishments include the Tidal Basin Bridge and K Street Freeway in Washington, D.C.; a heating plant for his alma mater, the University of Iowa; a civilian airfield in Tuskegee, Alabama; and a sewage disposal plant in Grand Rapids, Michigan.

A member of Kappa Alpha Psi, Alexander was awarded the fraternity's "Laurel Wreath" for great accomplishment in 1925. Alexander received honorary civil engineering degrees from the University of Iowa in 1925 and Howard University in 1946. The following year, Alexander was named one of the University of Iowa's outstanding alumni and "one of the first hundred citizens of merit." Politically active, Alexander was appointed governor of the Virgin Islands in 1954 by President Dwight Eisenhower, though he was forced to resign one year later due to health problems. He died at his home in Des Moines, Iowa, in 1958.

BENJAMIN BANNEKER (1731–1806) Mathematician/Statistician, Astronomer, Surveyor/Explorer, Publisher

Benjamin Banneker was born on November 9, 1731, on a tobacco farm near Baltimore, Maryland. His mother was a free woman, and his father was her slave, whom she purchased and married. At the age of 21, Banneker became interested in watches and later constructed a grandfather clock based upon a pocket watch he had seen, calculating the ratio of the gears and wheels and carving them from wood. The clock operated for more than 40 years.

Banneker's aptitude for mathematics and knowledge of astronomy enabled him to accurately predict the solar eclipse of 1789. By 1791, he began publishing an almanac, which contained tide tables, weather information, data on future eclipses, and a listing of useful medicinal products and formulas. The almanac, the first scientific book published by an African American, appeared annually for several years. Banneker sent a copy to Thomas Jefferson, and the two corresponded, debating the subject of slavery.

Banneker served as a surveyor on the six-person team that helped lay out the base lines and initial boundaries for Washington, D.C. When the city's planner, Major Pierre Charles L'Enfant, abruptly quit the project and took his plans with him, Banneker was able to reproduce the plans from memory in their entirety. He died on October 25, 1806.

ANDREW J. BEARD (1849–1921) Railroad Porter, Inventor

Inventor Andrew Jackson Beard was born a slave in Eastlake, Alabama, in 1849. While working in an Alabama railroad yard, Beard had seen men lose hands, arms, legs, and even their lives in accidents occurring during the manual coupling of railroad cars. The system in use involved the dropping of a metal pin at exactly the right moment when two cars met. Men were often caught between cars and crushed to death during this split-second operation.

Beard’s invention, the “Jenny Coupler,” was an automatic device that secured two cars by merely bumping them together. In 1897, Beard received $50,000 for an invention that has since prevented the death or maiming of countless railroad workers.

DAVID BLACKWELL (1919– ) Mathematician

Considered by some to be the greatest African American mathematician of all time, David Blackwell, the eldest of four children, was born on April 24, 1919, in Centralia, Illinois. As a schoolboy, Blackwell gravitated toward the study of geometry, and after taking a course in introductory analysis, fell in love with mathematics.

Following high school graduation, Blackwell began mathematics study at the University of Illinois in 1935, when he was 16 years old, and graduated with an A.B. in 1938, an A.M. in 1939, and a Ph.D. in 1941, when he was 22. After graduation, he received an appointment to a prestigious postdoctoral fellowship—the Rosenwald Fellowship—at the Institute for Advanced Study in Princeton. Although his colleagues wished to renew his one-year appointment, the president of Princeton University organized a protest and admonished the Institute for offering the fellowship to an African American. Later, during the 1950s and 1960s, as Martin Luther King Jr. and other civil rights leaders brought the issue of racial equality to the forefront of the American consciousness, Blackwell was able to rise through the ranks of academia at other institutions.

Blackwell obtained instructorships at Southern University and Clark College, and in 1944 was named to the faculty of Howard University. In three years, Blackwell advanced from his instructor position to full professor, then to chairman of the department. He also spent time at the RAND Corporation and was a visiting professor of statistics at Stanford University from 1950 to 1951.

While at Stanford, Blackwell met M. A. Girschick, a faculty member in the agriculture department. The two men began a collaboration that lasted about a decade and resulted in many articles and a book, Theory of Games and Statistical Decisions (1954), which became a classic in the field. Blackwell published over 50 articles and another book during his long career.

Blackwell returned to Howard University, where he remained until 1954. He was then appointed professor of statistics and chairman of the statistics department at the University of California at Berkeley, where he remained until his retirement in 1989. In his career, Blackwell published over 80 papers on topics that included Bayesian statistics, probability, game theory, set theory, dynamic programming, and information theory.

Blackwell served as president of the Institute of Mathematical Statistics in 1955; as vice president of the American Statistical Association, the International Statistical Institute, and the American Mathematical Society; and as president of the Bernoulli Society. In 1965, he became the first African American to be named to the National Academy of Sciences. In addition, he was named an honorary fellow of the Royal Statistical Society and a member of the American Academy of Arts and Sciences.

Among his numerous honorary degrees and prizes, Blackwell was awarded the Von Neumann Theory Prize by the Operations Research Society of America in 1979. In 2002, the Mathematical Sciences Research Institute at Berkeley and Cornell University established the Blackwell-Tapia Award in honor of David Blackwell and Richard A. Tapia, mathematicians who have inspired generations of African American and Hispanic American students and professionals in the mathematical sciences.

GUION S. BLUFORD JR. (1942– ) Space/Atmospheric Scientist, Aerospace Engineer, Air Force Officer, Airplane Pilot

Guion "Guy" Bluford was born November 22, 1942, in Philadelphia. He graduated with a B.S. from Pennsylvania State University in 1964, then joined the U.S. Air Force and was assigned to pilot training at Williams Air Force Base in Arizona. Bluford served as a fighter pilot in Vietnam and flew 144 combat missions, 65 of them over North Vietnam. Attaining the rank of lieutenant colonel, Bluford received an M.S. from the Air Force Institute of Technology in 1974 and a Ph.D. in aerospace engineering in 1978.

In 1979, Bluford was accepted into NASA's astronaut program as a mission specialist. On August 30, 1983, with the liftoff of the Challenger shuttle, Bluford became the first African American in space. He flew three other space shuttle missions, aboard Challenger in 1985 and aboard Discovery in 1991 and 1992, for a total of 688 hours in space.

Bluford retired from NASA in 1993 to pursue a career in private industry, serving as vice president and general manager of the Engineering Services Division of NYMA, Inc. In that position, Bluford directed engineers, scientists, and technicians who provided engineering support to NASA’s Lewis Research Center.

Bluford has won numerous awards including the Distinguished National Science Award given by the National Society of Black Engineers (1979), NASA Group Achievement Award (1980, 1981), NASA Space Flight Medal (1983), and the NAACP Image Award in 1983. Some of his military honors include the National Defense Service Medal (1965), Vietnam Campaign Medal (1967), Air Force Commendation Medal (1972), Air Force Meritorious Service Award (1978), the USAF Command Pilot Astronaut Wings (1983), and the NASA Distinguished Service Medal (1994). In 1997, Bluford was inducted into the International Space Hall of Fame.

CHARLES F. BOLDEN JR. (1946– ) Airplane Pilot, Space/Atmospheric Scientist, Marine Officer, Operations and Systems Researcher/Analyst

Born in Columbia, South Carolina, and a graduate of the U.S. Naval Academy and the University of Southern California, Charles Bolden Jr. holds a B.S. in electrical science and an M.S. in systems management. Bolden began his career as a second lieutenant in the U.S. Marine Corps, becoming a naval aviator by 1970. In 1973, he flew more than 100 sorties while stationed in Thailand. Upon his return to the United States, Bolden began a tour as a Marine Corps selection and recruiting officer. In 1979, he graduated from the U.S. Naval Test Pilot School and was assigned to the Naval Test Aircraft Directorates.

Bolden was selected as an astronaut candidate by NASA in May of 1980 and, in July of 1981, completed the training and evaluation program, making him eligible for assignment as a pilot on space shuttle flight crews. A veteran of four shuttle missions, Bolden served as pilot for the Hubble Space Telescope deployment mission and was commander of the first joint American-Russian space shuttle mission. In 1994, he accepted a position at the Naval Academy.

Bolden has been awarded the Defense Superior Service Medal, the Defense Meritorious Service Medal, the Air Medal, the Legion of Merit, and the Strike/Flight Medal. In 1998, he was promoted to the rank of major general. He served as Commanding General, 3rd Marine Aircraft Wing from 2000 to 2002. He retired in 2004.

In 2002, Bolden was nominated for the position of deputy administrator of the National Aeronautics and Space Administration (NASA). However, due to a Defense Department order limiting the service of senior military officers in civilian jobs during the U.S. antiterrorism offensive, his nomination was withdrawn from consideration.

MARJORIE L. BROWNE (1914–1979) Mathematician/Statistician, Educator

Browne was born September 9, 1914, in Memphis, Tennessee. She received a B.S. in mathematics from Howard University in 1935, an M.S. from the University of Michigan in 1939, and a Ph.D. in mathematics, again from the University of Michigan, in 1949. Browne was one of the first two African American women to earn a Ph.D. in mathematics. She taught at the University of Michigan in 1947 and 1948. She accepted the post of professor of mathematics at North Carolina Central University in 1949 and became department chairperson in 1951. In 1960, she received a grant from IBM to establish one of the first computer centers at a minority university.

Browne’s doctoral dissertation dealt with topological and matrix groups, and she was published in the American Mathematical Monthly. She was a fellow of the National Science Foundation in 1958–1959 and again in 1965–1966. Browne was a member of the American Mathematical Society, the Mathematical Association of America, and the Society for Industrial and Applied Mathematics. She died in 1979.

ALEXA I. CANADY (1950– ) Neurosurgeon

Alexa Irene Canady, the first African American woman to become a neurosurgeon in the United States, was born to Elizabeth Hortense (Golden) Canady and Clinton Canady, Jr., a dentist, on November 7, 1950, in Lansing, Michigan. Canady was recognized as a National Achievement Scholar while in high school, and attended the University of Michigan where she received a B.S. in 1971 and an M.D. in 1975. As a medical student, she was elected to Alpha Omega Alpha honorary medical society and received the American Medical Women’s Association citation.

Canady spent her internship at the Yale-New Haven Hospital from 1975 to 1976. She gained her landmark residency in neurosurgery at the University of Minnesota from 1976 to 1981, becoming the first woman and the first African American to be granted a neurosurgery residency there. Following her residency, Children’s Hospital of Philadelphia awarded Canady a fellowship in pediatric neurosurgery in 1981–1982. In addition to treating patients directly, Canady served as an instructor in neurosurgery at the University of Pennsylvania School of Medicine.

In 1982, she accepted a position at Henry Ford Hospital in Detroit. The following year, Canady transferred to pediatric neurosurgery at Children’s Hospital of Michigan, where she became assistant director of neurosurgery three years later and director in 1987.

She began teaching at Wayne State University School of Medicine as a clinical instructor in 1985 and assumed a clinical associate professorship in 1987. Canady retired in 2001, at which point she was chief of neurosurgery at Children’s Hospital of Michigan. Among her numerous honors and awards, Canady has been inducted into the Michigan Women’s Hall of Fame and was named Woman of the Year in 1993 by the American Medical Women’s Association. Throughout her career, Canady worked toward changing the perception of African Americans as patients and as physicians.

GEORGE R. CARRUTHERS (1939– ) Astrophysicist

Dr. George Carruthers is one of the two Naval Research Laboratory scientists responsible for the Apollo 16 lunar surface ultraviolet camera/spectrograph, which was placed on the lunar surface in April 1972. Carruthers designed the instrument, while William Conway adapted it for the lunar mission. The images, obtained from 11 targets, included the first photographs of the ultraviolet equatorial bands of atomic oxygen that girdle the Earth. The camera was also used on Skylab in 1974.

Carruthers, who was born on October 1, 1939, in Cincinnati, Ohio, grew up on Chicago’s South Side. He built his first telescope at the age of 10. He received his Ph.D. in aeronautical/astronautical engineering from the University of Illinois in 1964.

In 1966, Carruthers became a research assistant at the Navy’s E. O. Hulburt Center for Space Research, where he began work on the lunar surface ultraviolet camera/spectrograph. The images received from the moon gave researchers invaluable information about the Earth’s atmosphere, including possible new ways to control air pollution. The images also aided in the detection of hydrogen in deep space—evidence that plants are not the only source of Earth’s oxygen.

Carruthers has continued his research and is currently head of the Ultraviolet Measurements Group in the Space Science Division of the Naval Research Laboratory. He is the recipient of the NASA Exceptional Scientific Achievement Medal for his work on the ultraviolet camera/spectrograph. Carruthers also won the Arthur S. Fleming Award in 1971 and the 2000 Outstanding Scientist Award presented by the National Institute of Science. In 2003, he was inducted into the National Inventors Hall of Fame in Akron, Ohio. In 2004, he was chosen as one of the 50 Most Important Blacks in Research Science.

BEN CARSON (1951– ) Neurosurgeon

Born Benjamin Solomon Carson on September 18, 1951, in Detroit, Michigan, Dr. Carson has been recognized throughout the international medical community for his prowess in performing complex neurosurgical procedures, particularly on children with pediatric brain tumors, his main focus.

Among his accomplishments are a number of successful hemispherectomies, a procedure in which one hemisphere of the brain is removed from a critically ill seizure victim or other neurologically diseased patient to radically reduce the incidence of seizures. Carson’s most famous operation took place in 1987, earning him international acclaim. That year he successfully separated a pair of West German conjoined (Siamese) twins who had been joined at the backs of their heads. The landmark operation took 22 hours; Carson led a surgical team of 70 doctors, nurses, and technicians.

Carson was raised in Detroit. His mother, Sonya Carson, who dropped out of school in the third grade, married at 13. She divorced when Carson was eight years old and raised her two sons as a single mother who worked several jobs. Known for his quick temper, Carson was a problem student—he almost killed a peer during a knife fight when he was 14 years old. He was also a failing student, and although she could barely read herself, his mother imposed a reading program on him and limited his television viewing until his grades improved.

In high school, he continued to excel and was accepted at Yale University in 1969 with a scholarship. With a B.A. in psychology from that Ivy League institution, Carson entered the University of Michigan, where his interest shifted to neurosurgery and where he obtained his M.D. in 1977. For one year, he served as a surgical intern at the Johns Hopkins Hospital, later completing his residency there. From 1983 to 1984, Carson practiced at the Sir Charles Gairdner Hospital in Perth, Australia. In 1984, at 33 years of age, Johns Hopkins named him director of its division of pediatric neurosurgery, making him the youngest chief of pediatric neurosurgery in the United States. Carson is a professor of neurosurgery, plastic surgery, oncology, and pediatrics at Johns Hopkins. He is also the co-director of the Johns Hopkins Craniofacial Center.

In 2002, Carson, a devout Seventh-day Adventist, was diagnosed with a high-grade prostate cancer that was treated successfully with surgery. He later stated that his experience as a patient made him a more empathetic physician and more devoted to disease prevention.

Carson has authored three best-selling books: Gifted Hands (1990), Think Big (1992), and The Big Picture (1999). He has also established a scholarship fund, called the Carson Scholars Fund, with the aid of his wife, Candy Carson, and co-founded the Benevolent Endowment Fund to assist underinsured and uninsured patients requiring brain surgery. Carson has been awarded over 33 honorary degrees and numerous awards for his work.

GEORGE WASHINGTON CARVER (c. 1864–1943) Educator, Agricultural/Food Scientist, Farmer

George Washington Carver devoted his life to research projects connected primarily with Southern agriculture. The products he derived from the peanut and the soybean revolutionized the economy of the South by liberating it from an excessive dependence on cotton.

Born a slave in Diamond Grove, Missouri, sometime between 1861 and 1865, Carver was only an infant when his mother was abducted from his owner’s plantation by a band of slave raiders. His mother was sold and shipped away, and Carver was raised by his mother’s owners, Moses and Susan Carver. Carver was a frail and sickly child, and he was assigned lighter chores around the house. Later, he was allowed to attend high school in a neighboring town.

Carver worked odd jobs while he pursued his education. He was the first African American student admitted to Simpson College in Indianola, Iowa. He then attended Iowa Agricultural College (now Iowa State University) where, while working as the school janitor, he earned a degree in agricultural science in 1894. Two years later he received a master’s degree from the same school and became the first African American to serve on its faculty. Within a short time, his fame spread, and Booker T. Washington offered him a post at Tuskegee Institute. It was at the Institute’s Agricultural Experiment Station that Carver did most of his work.

Carver revolutionized the Southern agricultural economy by showing that 300 products could be derived from the peanut. By 1938, peanuts had become a $200 million industry and a chief product of Alabama. Carver also demonstrated that 100 different products could be derived from the sweet potato.

Although he did hold three patents, Carver never patented most of the many discoveries he made while at Tuskegee, saying, “God gave them to me, how can I sell them to someone else?” In 1940, he established the George Washington Carver Foundation and willed the rest of his estate to the organization, so his work might be carried on after his death. He died on January 5, 1943.

JEWEL PLUMMER COBB (1924– ) Cell Biologist

Born in Chicago on January 17, 1924, Jewel Cobb grew up exposed to a variety of African American professionals through her parents. Her father, Frank Plummer, was a physician, and her mother, Claribel (Cole) Plummer, was an interpretive dancer. By 1950, she had completed her M.S. and Ph.D. in biology. As a cell biologist, her focus was the action and interaction of living cells. She was particularly interested in tissue culture, in which cells are grown outside of the body and studied under microscopes.

Among her most important work was her study—with Dorothy Walker Jones—of how new cancer-fighting drugs affect human cancer cells. Cobb also conducted research into skin pigment. She was particularly interested in melanoma, a form of skin cancer, and melanin’s ability to protect skin from damage caused by ultraviolet light.

Cobb noted the scarcity of women in scientific fields, and she wrote about the difficulties women face in a 1979 paper “Filters for Women in Science.” In this piece, Cobb argued that various pressures, particularly in the educational system, act as filters that prevent many women from choosing science careers. The socialization of girls has tended to discourage them from pursuing math and the sciences from a very early age, and even those women who got past such obstacles have struggled to get university tenure and the same jobs (at equal pay) as men.

Cobb has been president emeritus of California State University in Fullerton since 1990. She has been active in her community, recruiting women and minorities to the sciences and founding a privately funded gerontology center.

Among Cobb’s numerous honors and awards is the National Science Foundation Lifetime Achievement Award for Contributions to the Advancement of Women and Underrepresented Minorities in 1993.

W. MONTAGUE COBB (1904–1990) Anthropologist, Organization Executive/Founder, Medical Researcher, Educator, Editor

William Montague Cobb was born on October 12, 1904, in Washington, D.C. For more than 40 years, he was a member of the Howard University Medical School faculty; thousands of medical and dental students studied under his direction. At Howard, he built a collection of more than 600 documented skeletons and a comparative anatomy museum in the gross anatomy laboratory. In addition to a B.A. from Amherst College, an M.D. from Howard University, and a Ph.D. from Case Western Reserve, he received many honorary degrees. Cobb died on November 20, 1990, in Washington, D.C.

As editor of the Journal of the National Medical Association for 28 years, Cobb developed a wide range of scholarly interests manifested by the nearly 700 published works under his name in the fields of medical education, anatomy, physical anthropology, public health and medical history. He was the first African American elected to the presidency of the American Association of Physical Anthropologists and served as the chairman of the anthropology section of the American Association for the Advancement of Science. Among his many scientific awards is the highest award given by the American Association of Anatomists. For 31 years, he was a member of the board of directors of the NAACP and served as the president of the board from 1976 to 1982.

PRICE M. COBBS (1928– ) Psychiatrist, Author, Management Consultant

Cobbs was born Price Mashaw Cobbs in Los Angeles, California, on November 2, 1928, and followed in his father’s path when he enrolled in medical school after earning a B.A. from the University of California at Berkeley in 1954. He graduated from Meharry Medical College with an M.D. in psychiatric medicine in 1958 and, within a few years, had established his own San Francisco practice in psychiatry.

With his academic colleague at the University of California, William H. Grier, Cobbs co-authored the groundbreaking 1968 study, Black Rage. In it, the authors argued that pervasive social and economic racism had produced an endemic anger that stretched across all strata of African American society, from rich to poor, an anger made apparent and magnified by the social unrest of the 1960s. Cobbs and Grier also co-authored a second book, The Jesus Bag (1971), which discussed the role of organized religion in the African American community. In 2003, Cobbs co-authored Cracking the Corporate Code with Judith L. Turnock.

A seminar Cobbs held in 1967 with other mental health care professionals eventually led him to found his own diversity training company, Pacific Management Systems (PMS). Since its inception, the company has been instrumental in providing sensitivity training for Fortune 500 companies, community groups, law enforcement bodies, and social service agencies.

A member of numerous African American professional and community organizations, as well as an assistant clinical professor at the University of California at San Francisco, Cobbs continues to guide PMS well into its third decade. The firm has pioneered the concept of ethnotherapy, which uses the principles of group therapy to help seminar participants rethink their attitudes toward members of other ethnic groups, the disabled, and those of alternative sexual orientations.

PATRICIA S. COWINGS (1948– ) Psychophysiologist

Patricia Suzanne Cowings, the first female scientist trained as an astronaut, although she never flew in space, was born in New York City on December 15, 1948. She was fascinated by astronomy as a young girl, and this interest continued through her years at New York City’s High School of Music and Art and into college.

Cowings received her B.S. in psychology from the State University of New York, Stony Brook, in 1970, and her M.A. and Ph.D. in psychology from the University of California, Davis, in 1973. She received a research scholarship at the National Aeronautics and Space Administration (NASA), taught for a time, and then returned to NASA in 1977.

Cowings has spent 23 years at the NASA Ames Research Center at Moffett Field, California, where she is director of the Gravitational Research Branch and the science director of the Psychophysiological Research Laboratory. She worked with Russian cosmonauts for five years. Cowings spent 15 years as the principal investigator for space shuttle experiments on autogenic feedback training, a method she developed for preventing zero-gravity sickness, which is comparable to motion sickness on Earth. Her program, which combines training and biofeedback, offers an effective method for controlling such processes as heart rate, blood pressure, and body temperature during space flight. She believes in the ability of the mind to control the body’s responses in adapting to an unfamiliar environment.

Cowings has received numerous honors and awards, including the Research Leadership Award at the 11th Annual National Women of Color Technology Awards Conference in 2006.

ELBERT F. COX (1895–1969) Educator, Mathematician/Statistician

Cox was born in Evansville, Indiana, on December 5, 1895. He received his B.A. from Indiana University in 1917 and his Ph.D. from Cornell University in 1925. His dissertation dealt with polynomial solutions of difference equations and made Cox the first African American to be awarded a Ph.D. in pure mathematics.

Cox was an instructor at Shaw University (1921–1923), a professor of physics and mathematics at West Virginia State College (1925–1929), and an associate professor of mathematics at Howard University from 1929 to 1947. In 1947, he was promoted to full professor, and he retired in 1966.

During his career, Cox specialized in interpolation theory and differential equations. Among his professional accolades were memberships in such honor societies as Beta Kappa Chi, Pi Mu Epsilon, and Sigma Pi Sigma. He was also active in the American Mathematical Society, the American Physical Society, and the American Institute of Physics. Cox died on November 28, 1969.

ULYSSES G. DAILEY (1885–1961) Editor, Health Administrator, Surgeon, Diplomat

From 1908 to 1912, Ulysses Grant Dailey served as surgical assistant to Dr. Daniel Hale Williams, founder of Provident Hospital and noted heart surgeon. Born in Donaldsonville, Louisiana, in 1885, Dailey graduated in 1906 from Northwestern University Medical School, where he was appointed a demonstrator in anatomy. He later studied in London, Paris, and Vienna, and in 1926 set up his own hospital and sanitarium in Chicago. Dailey was associated with Provident Hospital after his own hospital closed in 1932, and he retained a position there until his death.

A member of the editorial board of the Journal of the National Medical Association for many years, Dailey traveled around the world in 1933 under the sponsorship of the International College of Surgeons, of which he was a founder fellow. In 1951 and 1953, the U.S. State Department sent him to Pakistan, India, Ceylon, and Africa. One year later, he was named honorary consul to Haiti, moving there in 1956 when he retired.

CHRISTINE M. DARDEN (1942– ) Aerospace Engineer

Christine Darden was born on September 10, 1942, in Monroe, North Carolina. Valedictorian of her class at Allen High School in Asheville, North Carolina, she attended the Hampton Institute in Hampton, Virginia, and graduated with a B.S. in mathematics in 1962.

Darden joined NASA as a mathematician in 1967, but her interests began to shift to engineering. After teaching high school mathematics for a time, Darden took engineering classes at Virginia State College in Petersburg, Virginia, and received her M.S. in applied mathematics there in 1978. In 1983, Darden earned a Ph.D. in mechanical engineering from George Washington University in Washington, D.C., specializing in fluid mechanics (the flow of gases and liquids).

Darden returned to NASA and was given her first independent research project, which led to her study of sonic boom, the explosive sound caused by the shock wave that occurs when an airplane flies faster than the speed of sound. She later headed a team in the Supersonic Transport Research Project. Darden attempted to replicate the computer model in a real system to test her theory that the sonic boom of a supersonic flight could be minimized. She succeeded in this attempt while serving as a consultant to the Defense Advanced Research Projects Agency.

Darden has served as a senior project engineer in the Advanced Vehicles Division at the NASA Langley Research Center in Hampton, Virginia, heading the Sonic Boom Group. Her team works on aircraft that fly faster than the speed of sound (supersonic aircraft). By using wind tunnels and modifying aircraft designs, her group’s simulations of supersonic flight have identified aircraft shapes that could reduce the effects of sonic boom. Reduced sonic boom, combined with new technology, could enable significant reductions in flight times. As of 2006, Darden directs the Strategic Communications and Education program at NASA.

CHARLES R. DREW (1904–1950) Educator, Medical Researcher, Health Administrator, Surgeon/Physician

Using techniques already developed for separating and preserving blood, Charles Drew pioneered further into the field of blood preservation and organized procedures from the research level to the clinical level, leading to the founding of the world’s two largest blood banks just prior to World War II. Born on June 3, 1904, in Washington, D.C., Drew graduated from Amherst College in Massachusetts, where he received the Mossman Trophy for having brought the most honor to the school during his four years there. He was not only an outstanding scholar, but also the captain of the track team and a star halfback on the football team.

After receiving his medical degree from McGill University in 1933, Drew returned to Washington, D.C., to teach pathology at Howard. In 1940, while taking his D.Sc. degree at Columbia University, he wrote a dissertation on “banked blood,” and soon became such an expert in this field that the British government called upon him to set up the first blood bank in England. He also initiated the use of bloodmobiles, trucks equipped with refrigerators to transport blood and plasma to remote locations.

During World War II, Drew was appointed director of the American Red Cross blood donor project. Later, he served as chief surgeon at Freedmen’s Hospital in Washington, D.C., as well as professor of surgery at Howard University Medical School from 1941 to 1950. A recipient of the 1944 Spingarn Medal, Drew was killed in an automobile accident on April 1, 1950.

JOYCELYN ELDERS (1933– ) Physician, Endocrinologist, Former U.S. Surgeon General

Dr. Joycelyn Elders was born Minnie Lee Jones on August 13, 1933, in Schaal, Arkansas. The first of eight children, she grew up working in cotton fields. An avid reader, Jones earned a scholarship to Philander Smith College in Little Rock, where she studied biology and chemistry in hopes of becoming a lab technician. She was inspired toward greater ambitions after meeting Edith Irby Jones (no relation), the first African American woman to study at the University of Arkansas School of Medicine. In college, Jones changed her name to Minnie Joycelyn Lee, but later dropped the name Minnie.

After obtaining her B.A., Jones served as a physical therapist in the U.S. Army in order to fund her postgraduate education. She was able to enroll in the University of Arkansas School of Medicine in 1956. However, as the only African American woman, and one of only three African American students, she and the other African Americans were forced to use a separate university dining facility—the one provided for the cleaning staff.

After a brief first marriage following college, Jones married Oliver B. Elders in 1960. The newly dubbed Joycelyn Elders fulfilled a pediatric internship at the University of Minnesota, and then returned to Little Rock in 1961 for a residency at the University of Arkansas Medical Center. Her success in the position led her to be appointed chief pediatric resident, in charge of the all-white, all-male battery of residents and interns.

During the next 20 years, Elders forged a successful clinical practice, specializing in pediatric endocrinology (the study of glands). She published more than 100 papers in that period and rose to professor of pediatrics, a position she maintained from 1976 until 1987, when she was named director of the Arkansas Department of Health.

Over the course of her career, Elders’ focus shifted somewhat from diabetes in children to sexual behavior. At the Department of Health, Elders was able to pursue her public advocacy regarding teenage pregnancy and sexually transmitted diseases. In 1993, U.S. President Bill Clinton nominated Elders for the post of U.S. Surgeon General, making her the first African American and the second woman to hold the position. Though many decried her liberal stances and her confirmation was contested, she was confirmed by the Senate on September 7, 1993.

During her tenure, Elders attacked Medicaid for failing to help poverty-stricken women prevent unwanted pregnancies and faulted pharmaceutical companies for overpricing contraceptives. Between 1993 and December of 1994, she spoke out in support of the medicinal use of marijuana, the study of drug legalization, and family planning, and against toy guns for children. An uproar arose when Elders was reported to have recommended that masturbation be discussed in schools as part of human sexuality education. Clinton forced her to resign in December 1994.

Elders returned to the University of Arkansas Medical School, and resumed teaching, though the state’s General Assembly budget committee tried to block her return. In 1995, she was hosting a daily talk show on AM stations KYSG in Little Rock and WERE in Cleveland. That same year, she joined the board of the American Civil Liberties Union. In 1996, her autobiography, entitled Joycelyn Elders MD: From Sharecropper’s Daughter to Surgeon General of America, was published.

Although Elders retired from medicine in 1999, she continues to lecture throughout the United States on health-related and civic issues.

PHILIP EMEAGWALI (1954– ) Mathematician, Engineer, Computer Scientist

Philip Chukwurah Emeagwali, considered a “father” of the Internet, was born on August 23, 1954, in Akure, Nigeria. A young math prodigy, Emeagwali had his high school studies interrupted when he and his family, members of the Igbo tribe, were forced to flee to eastern Nigeria during the country’s civil war. After living in refugee camps and serving as a cook in the Biafran army, Emeagwali entered Christ the King College in Onitsha, and in 1973, earned a general certificate of education from the University of London.

Emeagwali ventured to the United States the following year to pursue further studies. He earned a B.S. in mathematics from Oregon State University in 1977, an M.S. in civil engineering from George Washington University in 1981, a post-master’s degree in ocean, coastal, and marine engineering in 1986, an M.A. in applied mathematics from the University of Maryland in 1986, and a Ph.D. in scientific computing from the University of Michigan in 1993. Emeagwali became a citizen of the United States in 1981 when he married Dale Brown, a renowned scientist and cancer researcher.

In the mid-1970s, Emeagwali began working with computers, eventually becoming a supercomputer programmer. By 1989, he had set a world record for computational speed, using 65,000 separate computer processors to perform 3.1 billion calculations per second. The work was important in that it helped scientists grasp the capabilities of supercomputers and the practical uses of linking numerous computers so that they could communicate, an idea associated with the later growth of the Internet. For his work, Emeagwali earned the 1989 Gordon Bell Prize, a highly coveted award in the computing world. Later, Emeagwali’s background in engineering, combined with his knowledge of computers, also led to an important breakthrough in the field of petroleum engineering.

Emeagwali, who works as an independent consultant and describes himself as a “public intellectual,” has performed engineering duties for the Maryland State Highway Commission and the U.S. Bureau of Reclamation. In addition to work as a researcher at the National Weather Service and the University of Michigan during the 1980s, he was a research fellow at the University of Minnesota’s Army High Performance Computing Research Center from 1991 to 1993. He is the recipient of over 100 awards, including Computer Scientist of the Year, National Technical Association, 1993; Eminent Engineer, Tau Beta Pi National Engineering Honor Society, 1994; International Man of the Year, Minority Technology Council of Michigan, 1994; the Best Scientist in Africa Award of the Pan African Broadcasting, Heritage and Achievement Awards, 2001; and inclusion in the United Nations’ Gallery of Prominent Refugees, 2001.

SOLOMON C. FULLER (1872–1953) Neurologist, Psychiatrist

Born on August 11, 1872, in Monrovia, Liberia, Solomon C. Fuller was the son of Solomon Fuller, a coffee planter and government official whose father had been a slave in Virginia. In 1889, he sailed to the United States and earned his M.D. from Boston University School of Medicine in 1897.

By 1900, he had started his own study of mental patients, and four years later, traveled to Germany to study with Emil Kraepelin and Alois Alzheimer, the discoverer of the disease that bears his name. During his stay in Germany, Fuller had an opportunity to spend an afternoon with Paul Ehrlich, who in 1908 would win the Nobel Prize for his research in immunology.

Fuller’s most significant contribution was in the study of Alzheimer’s disease. By the latter part of the twentieth century, scientists still had not reached full agreement as to its cause. At the time of Fuller’s work, the prevailing belief was that arteriosclerosis, or hardening of the arteries, caused Alzheimer’s. Fuller disagreed and put forth this opinion in the course of diagnosing the ninth documented case of Alzheimer’s. Proof of his ideas came in 1953, the year he died, when other medical researchers confirmed that there was no link between arteriosclerosis and Alzheimer’s.

HELENE D. GAYLE (1955– ) Epidemiologist, AIDS Researcher

Helene Gayle was born on August 16, 1955, in Buffalo, New York, the third of five children of an entrepreneur father and a social worker mother. After graduating from Barnard College in 1976, she won acceptance to the University of Pennsylvania’s medical school.

Hearing a speech on the eradication of smallpox inspired Gayle to pursue public health medicine, a direction that would prove significant in the years to come as the AIDS epidemic came to devastate communities across the globe. She received her M.D. from the University of Pennsylvania as well as a master’s degree in public health from Johns Hopkins, both in 1981. After a residency in pediatrics, she was selected in 1984 to enter the epidemiology training program at the Centers for Disease Control (CDC) in Atlanta, Georgia, the nation’s top research center for infectious diseases.

For much of the 1980s, Gayle was intensely involved in the CDC’s research into AIDS and HIV infection through her work first in the center’s Epidemic Intelligence Service, and later as a chief of international AIDS research, a capacity in which she oversaw the scientific investigations of over 300 CDC researchers. Gayle has been instrumental in raising public awareness about the disease, and is especially driven to point out how devastating AIDS has been to the African American community. Sex education, better health care for the poor, and substance abuse prevention are some of the proposals Gayle has championed that she believes will help reduce deaths from AIDS.

In 1992, Gayle was hired as a medical epidemiologist and researcher for the AIDS division of the U.S. Agency for International Development, cementing her reputation as one of the international community’s top AIDS scientists. Gayle has served as director of CDC’s National Center for HIV, STD, and TB Prevention, and was named program director for HIV/AIDS and TB for the Bill and Melinda Gates Foundation in 2001. She was also named Assistant Surgeon General and Rear Admiral in the U.S. Public Health Service. Gayle was named president of CARE USA, a humanitarian organization dedicated to fighting poverty and disease, in 2006.

Gayle has been the recipient of a number of awards and honors, including the Medical Leadership in Industry Award at the Women of Color, Health Science, and Technology Awards in 2002.

EVELYN BOYD GRANVILLE (1924– ) Author, Educator, Lecturer

Born in Washington, D.C., on May 1, 1924, Granville attended Smith College from 1941 to 1945 and earned an A.B. in mathematics. She went on to Yale University, where she received an M.A. in 1946 and a Ph.D. in mathematics in 1949, making her one of the first two African American women to be awarded a Ph.D. in pure mathematics.

Granville’s first teaching position was as an instructor at New York University (1949–1950). She then moved to Fisk University, where she was an assistant professor (1950–1952). From 1956 to 1960, Granville worked for IBM on the Project Vanguard and Project Mercury space programs, analyzing orbits and developing computer procedures. Over the next seven years, Granville held positions at the Computation and Data Reduction Center of the U.S. Space Technology Laboratories and at the North American Aviation Space and Information Systems Division, and served as a senior mathematician for IBM.

Granville returned to teaching in 1967, taking a position at California State University, Los Angeles, and teaching there until her retirement in 1984. In addition to her role as a college-level instructor, Granville worked to improve mathematics skills at all levels. From 1985 to 1988, Granville emerged from retirement to teach mathematics and computer science at Texas College in Tyler, Texas. In 1990, she accepted an appointment to the Sam A. Lindsey Chair at the University of Texas at Tyler, and in subsequent years, taught there as a visiting professor. Granville is the co-author of Theory and Applications of Mathematics for Teachers (1975). She received an honorary doctorate from Smith College in 1989.

FREDERICK D. GREGORY (1941– ) Airplane Pilot, Astronaut

Gregory was born January 7, 1941, in Washington, D.C. He is the nephew of the late Dr. Charles Drew, noted blood plasma specialist. Under the sponsorship of United States Representative Adam Clayton Powell, Gregory attended the U.S. Air Force Academy and graduated with a B.S. in 1964. In 1977, he received an M.S.A. from George Washington University.

Gregory was a helicopter and fighter pilot for the USAF from 1965 to 1970, and a research and test pilot for the USAF and the National Aeronautics and Space Administration (NASA) beginning in 1971. In 1978, he was accepted into NASA’s astronaut program, making him the second African American astronaut in NASA’s history. In 1985, he went into space aboard the space shuttle Challenger as its pilot, a first for an African American. Gregory served with NASA’s Office of Safety and Mission Assurance until 2001, when he was named acting associate administrator for the Office of Space Flight. In 2002, he was nominated by President George W. Bush, confirmed by the U.S. Senate, and appointed NASA’s deputy administrator. In 2005, Gregory served as acting administrator of NASA.

Gregory, who retired from the Air Force with the rank of colonel, belongs to the Society of Experimental Test Pilots, the Tuskegee Airmen, the American Helicopter Society, and the National Technical Association. He has been awarded a number of honorary doctorates as well as civic and community honors. He has also won numerous medals and awards including the Meritorious Service Medal, the Air Force Commendation Medal, and two NASA Space Flight Medals. He has twice received the Distinguished Flying Cross. He is also the recipient of George Washington University’s Distinguished Alumni Award, NASA’s Outstanding Leadership Award, and the National Society of Black Engineers’ Distinguished National Scientist Award. In 2003, he received the Presidential Rank Award for Distinguished Executives, and in 2004 and 2005, he was designated one of the 50 Most Important Blacks in Technology.

LLOYD A. HALL (1894–1971) Research Director, Chemist

Grandson of the first pastor of Quinn Chapel A.M.E. Church, the first African American church in Chicago, Lloyd Augustus Hall was born in Elgin, Illinois, on June 20, 1894. A top student and athlete at East High School in Aurora, Illinois, he graduated in the top 10 of his class and was offered scholarships to four different colleges in Illinois. In 1916, Hall graduated from Northwestern University with a B.S. in chemistry. He continued his studies at the University of Chicago and the University of Illinois.

During World War I, Hall served as a lieutenant, inspecting explosives at a Wisconsin plant. After the war, Hall joined the Chicago Department of Health Laboratories, where he quickly rose to senior chemist. In 1921, he took employment at Boyer Chemical Laboratory. He became president and chemical director of the Chemical Products Corporation the following year. In 1924, he was offered a position with Griffith Laboratories. Within one year, he was chief chemist and director of research.

While there, Hall discovered curing salts for the preserving and processing of meats, thus revolutionizing the meat-packing industry. He also discovered how to sterilize spices and researched the effects of antioxidants on fats. Along the way, he registered more than 100 patents for processes used in the manufacturing and packing of food, especially meat and bakery products.

In 1954, Hall became chairman of the Chicago chapter of the American Institute of Chemists. The following year, he was elected a member of the national board of directors, becoming the first African American to hold that position in the institute’s 32-year history. Upon his retirement from Griffith in 1959, Hall continued to serve as a consultant to various state and federal organizations. In 1961, he spent six months in Indonesia advising the Food and Agriculture Organization of the United Nations. From 1962 to 1964, he was a member of the American Food for Peace Council, an appointment made by President John F. Kennedy.

MARC R. HANNAH (1956– ) Computer Scientist

Marc Regis Hannah, a native Chicagoan, was born on October 13, 1956. In high school, he took a computer science course that kindled his interest in this relatively new field. Also inspired by the example of an older brother, he earned high grades that would qualify him for a Bell Laboratories-sponsored scholarship to engineering school. He would eventually earn a Ph.D. from Stanford University in 1985.

While at Stanford, he met James Clark, an engineering professor who was a pioneer in computer graphics, having invented a special computer chip that was the heart of an imaging process. Hannah redesigned the chip to operate five times faster, an advance that impressed Clark enough to invite Hannah to join him in founding a computer graphics company; in 1981, Silicon Graphics, an international market leader in 3-D computer graphics, was born. Silicon Graphics’ technology has been used to enhance many devices, such as military flight simulators and medical computed axial tomography (CAT) scanners. Among the most lucrative areas for this technology, and certainly the best known, is video and film animation: three-dimensional imaging made possible the special effects in films such as Star Wars, Terminator 2, and Jurassic Park. Hannah has served as vice president and chief scientist at the company and as chief architect of its Personal IRIS, Indigo, Indigo2, and Indy graphics subsystems.

MATTHEW A. HENSON (1866–1955) Seaman, Explorer/Surveyor, Author

Matthew Henson was born August 6, 1866, in Charles County, Maryland, near Washington, D.C. He attended school in Washington, D.C., for six years, but at the age of 13, signed on as a cabin boy on a ship headed for China. Henson worked his way up to seaman while he sailed over many of the world’s oceans. After several odd jobs in different cities, Henson met U.S. Navy surveyor Robert Edward Peary in Washington, D.C. Peary, who was planning a trip to Nicaragua, hired Henson on the spot as his valet. Henson was not pleased at being a personal servant, but nonetheless felt his new position held future opportunities.

Peary eventually made seven trips to the Arctic starting in 1893. He became convinced that he could become the first man to stand at the North Pole. Henson accompanied Peary on these trips to Greenland and became an integral part of Peary’s plans. The pair made four trips looking for a passageway to the North Pole. In 1909, Peary and Henson made their final attempt at reaching the Pole. Although Peary was undoubtedly the driving force of these expeditions, he was increasingly reliant on Henson. Henson’s greatest assets were his knowledge of the Inuit language and his ability to readily adapt to their culture. He was also an excellent dog driver, and possessed a physical stamina that Peary lacked due to leukemia. Henson felt that he was serving the African American race by his example of loyalty, fortitude, and trustworthiness.

By the end of March 1909, they were within 150 miles of their goal. Henson, because of his strength, would break trail and set up camp for the night, while Peary followed. On April 6, Henson thought he had reached the Pole; Peary arrived later to affirm the belief. Henson then had the honor of planting the U.S. flag.

In 1912, Henson wrote A Negro Explorer at the North Pole, but the book aroused little interest. By the 1930s, however, Henson began receiving recognition for his contributions to arctic exploration. In 1937, he was the first African American elected to the Explorers Club in New York. In 1944, he and other surviving members of the expedition received Congressional medals. In 1954, Henson received public recognition for his deeds from President Eisenhower. Henson died in 1955 and was buried in New York. In 1988, his remains were exhumed and reburied with full military honors at Arlington National Cemetery, next to the grave of Robert Peary.

WILLIAM A. HINTON (1883–1959) Lecturer, Medical Researcher, Educator

Long one of the world’s authorities on venereal disease, Dr. William A. Hinton is responsible for the development of the Hinton test, a reliable method for detecting syphilis. He also collaborated with Dr. J. A. V. Davies on what is now called the Davies-Hinton test for the detection of this same disease.

Born in Chicago on December 15, 1883, Hinton graduated from Harvard in 1905. He completed his medical studies at Harvard Medical School in three years, finishing in 1912. After graduation, he was a voluntary assistant in the pathological laboratory at Massachusetts General Hospital. This was followed by eight years of laboratory practice at the Boston Dispensary and at the Massachusetts Department of Public Health. In 1923, Hinton was appointed lecturer in preventive medicine and hygiene at Harvard Medical School, where he served for 27 years. In 1949, he became the first person of color to be granted a professorship there.

In 1931, at the Boston Dispensary, Hinton started a training school for poor girls so that they could become medical technicians. From these classes of volunteers grew one of the country’s leading institutions for the training of technicians. Though he lost a leg in an automobile accident, Hinton remained active in teaching and at the Boston Dispensary Laboratory, which he directed from 1916 to 1952. He died in Canton, Massachusetts, on August 8, 1959.

SHIRLEY ANN JACKSON (1946– ) Lecturer, Physicist

Born in Washington, D.C., on August 5, 1946, Shirley Ann Jackson graduated as valedictorian of her class from Roosevelt High School in 1964. In 1968, she received a B.S. degree from the Massachusetts Institute of Technology (MIT), where she was one of only two African American women in her undergraduate class. In 1973, she became the first African American woman in the United States to earn a Ph.D. in physics, also from MIT. She was later named a life member of the MIT Corporation, the institute’s board of trustees.

Jackson’s first position—as a research associate at the Fermi National Accelerator Laboratory in Batavia, Illinois—reflected her interest in the study of subatomic particles. Jackson has worked as a member of the technical staff on theoretical physics at AT&T Bell Laboratories, as a visiting scientist at the European Center for Nuclear Research in Geneva, and as a visiting lecturer at the NATO International Advanced Study Institute in Belgium.

In 1995, President Bill Clinton named Jackson as chair of the Nuclear Regulatory Commission (NRC). Under Jackson’s direction, the NRC became more aggressive about inspections and forced some top officials out of office because of their lax enforcement of safety regulations. Jackson has held the post of professor at Rutgers University and is active in many organizations including the National Academy of Sciences, the American Association for the Advancement of Science, and the National Science Foundation. In 2001, she was elected to the board of the Public Service Enterprise Group, just one day after being appointed a director of the AT&T Corporation. Jackson has also been named to seven boards of publicly traded companies since July 1999, when she became the eighteenth president of Rensselaer Polytechnic Institute.

Jackson has been awarded 40 honorary doctoral degrees and was inducted into the National Women’s Hall of Fame in 1998 for her work as a scientist and an advocate for education, science, and public policy. In 2004, Jackson was named one of seven 2004 fellows of the Association for Women in Science (AWIS), an organization dedicated to achieving equity and full participation of women in all areas of science and technology.

MAE C. JEMISON (1956– ) Physician/Surgeon

Mae Jemison was born October 17, 1956, in Decatur, Alabama, but her family moved to Chicago when she was three years old. She attended Stanford University on a National Achievement Scholarship and received a B.S. in chemical engineering and a B.A. in Afro-American studies in 1977. She then enrolled in Cornell University’s medical school and graduated in 1981. Her medical internship was at the Los Angeles County/University of Southern California Medical Center in 1982. She was a general practitioner with the INA/Ross Loos Medical Group in Los Angeles until 1983, followed by two years as a Peace Corps medical officer in Sierra Leone and Liberia. Returning to the United States in 1985, she began working for CIGNA Health Plans, a health maintenance organization in Los Angeles, and applied for admission into NASA’s astronaut program.

In 1987, Jemison was accepted into NASA’s astronaut program. Her first assignment was representing the astronaut office at the Kennedy Space Center in Cape Canaveral, Florida. On September 12, 1992, Jemison became the first African American woman in space, aboard the shuttle Endeavour. She served aboard the Endeavour as a science specialist and, as a physician, studied the effect of weightlessness on herself and other crew members. Jemison resigned from NASA in 1993 to pursue personal goals related to science education and health care in West Africa. In 1993, she was appointed to a Montgomery Fellowship at Dartmouth College, where she established the Jemison Institute for Advanced Technology in Developing Countries. That same year, she founded the Jemison Group, an advanced technologies research and consulting firm. In 1994, Jemison founded the International Science Camp in Chicago to help young people become enthusiastic about science.

In 1988, Jemison won the Science and Technology Award given by Essence magazine, and in 1990 she was Gamma Sigma Gamma’s Woman of the Year. In 1991, she received an honorary doctorate from Lincoln University. She also served on the board of directors of the World Sickle Cell Foundation from 1990 to 1992. In 2001, she published a memoir for children called Find Where the Wind Goes.

FREDERICK M. JONES (1893–1961) Mechanic, Inventor

In 1935, Frederick McKinley Jones built the first automatic refrigeration system for long-haul trucks. The system was later adapted to other carriers, including railway cars and ships. Previously, foods had been packed in ice, so even slight delays led to spoilage. Jones’s new method changed the eating habits of the entire nation and allowed food production facilities to be developed in almost any geographic location. Refrigerated trucks were also used to preserve and ship blood products during World War II.

Jones was born in Kentucky in 1893. His mother left the family when he was a baby, and his father left him at age five to be raised by a priest, in whose rectory Jones lived until he was 16 and received a sixth-grade education. When he left the rectory, he worked as a pin boy and a mechanic’s assistant, and finally as chief mechanic on a Minnesota farm. He served in World War I and, in the late 1920s, his mechanical fame spread when he developed a series of devices to adapt silent movie projectors into sound projectors.

Jones also developed an air conditioning unit for military field hospitals, a portable x-ray machine, and a refrigerator for military field kitchens. During his life, a total of 61 patents were issued in Jones’s name. He died in 1961.

PERCY L. JULIAN (1899–1975) Educator, Medical Researcher, Research Director

Born on April 11, 1899, in Montgomery, Alabama, Julian attended DePauw University in Greencastle, Indiana, where he graduated Phi Beta Kappa and valedictorian of his class, having lived during his college days in the attic of a fraternity house where he worked as a waiter. For several years, Julian taught at Fisk, West Virginia State College, and Howard University, where he was associate professor and head of the chemistry department. He left to attend Harvard and then the University of Vienna, where he earned a Ph.D. in 1931. Julian then continued his research and teaching duties at Howard.

In 1935, Julian synthesized the drug physostigmine, which is used today in the treatment of glaucoma. He later became director of research and chief chemist at the Glidden Company, where he conducted soybean research and specialized in the production of sterols, which he extracted from soybean oil. The method Julian perfected in 1950 eventually lowered the cost of sterols to less than 20 cents a gram and, ultimately, enabled millions of people suffering from arthritis to obtain relief through the use of cortisone, a sterol derivative. Later, Julian developed methods for manufacturing sex hormones from soybean sterols: progesterone was used to prevent miscarriages, while testosterone was used to treat older men for diminished sex drive. Both hormones were also important in the treatment of cancer.

In 1954, after serving as director of research for the Glidden Company, he founded his own company, the Julian Laboratories, in Chicago and Mexico. Years later, the company was sold to Smith, Kline, and French. In 1947, Julian was awarded the Spingarn Medal and, in 1964, he founded Julian Institute and Julian Associates Incorporated in Franklin Park, Illinois. He was awarded the Chemical Pioneer Award by the American Institute of Chemists in 1968. Julian died on April 19, 1975.

ERNEST E. JUST (1883–1941) Editor, Zoologist, Marine Biologist

Born in Charleston, South Carolina, on August 14, 1883, Ernest Just received his B.A. in 1907 with high honors from Dartmouth and his Ph.D. in 1916 from the University of Chicago. His groundbreaking work on the embryology of marine invertebrates included research on fertilization and on parthenogenesis, the development of an egg without fertilization, but his most important achievement was his discovery of the role protoplasm plays in the development of a cell.

Just began teaching at Howard University in 1907 and started graduate training at the Marine Biological Laboratory in Woods Hole, Massachusetts, in 1909. He performed most of his research at this site over the next 20 summers. Between 1912 and 1937, he published more than 50 papers on fertilization, parthenogenesis, cell division, and mutation. He also published a textbook in 1939 that was the result of his research in cell functioning and the structure and role of protoplasm within a cell.

A member of Phi Beta Kappa, Just received the first Spingarn Medal in 1915 and served as associate editor of Physiological Zoology, The Biological Bulletin, and The Journal of Morphology. In 1930, Just was one of 12 zoologists to address the International Congress of Zoologists, and he was elected vice president of the American Society of Zoologists. Ernest Just left the United States in 1929 because of racist attitudes that prevented his career from advancing. He died on October 27, 1941.

SAMUEL L. KOUNTZ (1930–1981) Physician/Surgeon, Medical Researcher

Born in 1930 in Lexa, Arkansas, Samuel Kountz graduated third in his class at the Agricultural, Mechanical and Normal College of Arkansas in 1952, having initially failed his entrance exams. He pursued graduate studies at the University of Arkansas, earning a degree in chemistry. Senator J. W. Fulbright, whom Kountz met while a graduate student, advised him to apply for a scholarship to medical school. Kountz won the scholarship on a competitive basis and was the first African American to enroll at the University of Arkansas Medical School in Little Rock, graduating with his M.D. in 1958. Kountz discovered that large doses of the drug methylprednisolone could help reverse the acute rejection of a transplanted kidney; the drug was used for a number of years in the standard management of kidney transplant patients.

While he was still an intern, Kountz assisted in the first West Coast kidney transplant. In 1964, working with Dr. Roy Cohn, one of the pioneers in the field of transplantation, Kountz again made medical history by transplanting a kidney from a mother to a daughter—the first transplant between humans who were not identical twins. At the University of California in 1967, Dr. Kountz worked with other researchers to develop the prototype of a machine that can now preserve kidneys for up to 50 hours from the time they are taken from the body of a donor. The machine, the Belzer Kidney Perfusion Machine, was named for Dr. Folkert O. Belzer, who was Kountz's partner. Kountz went on to build one of the largest kidney transplant training and research centers in the nation. He died in 1981 after a long illness contracted on a trip to South Africa in 1977.

LEWIS H. LATIMER (1848–1928) Draftsperson, Electrical Engineer

Lewis Howard Latimer, a pioneer in the development of the electric light bulb, was employed by Alexander Graham Bell to make the patent drawings for the first telephone, and later he went on to become chief draftsman for both the General Electric and Westinghouse companies.

Born in Chelsea, Massachusetts, on September 4, 1848, and raised in Boston, Latimer enlisted in the Union Navy at the age of 15 and began studying drafting upon completion of his military service. In 1881, he invented a method of making carbon filaments for the Maxim electric incandescent lamp and later patented this method. He also supervised the installation of electric light in New York, Philadelphia, Montreal, and London for the Maxim-Weston Electric Company. In 1884, he joined the Edison Company.

Latimer died on December 11, 1928, in New York City.

THEODORE K. LAWLESS (1892–1971) Physician, Philanthropist

Theodore Kenneth Lawless was born on December 6, 1892, in Thibodeaux, Louisiana. He received his B.S. from Talladega College in 1914 and continued to further his education at the University of Kansas and Northwestern University, where he received his M.D. in 1919. He then pursued a master’s in dermatology, which he finished at Columbia University. From there he furthered his studies at Harvard University, the University of Paris, the University of Freiburg, and the University of Vienna.

Upon his return to Chicago in 1924, Lawless started his own practice in the city's predominantly African American South Side, which he maintained until his death in 1971. He soon became one of the premier dermatologists in the country and earned great praise for researching treatments and cures for a variety of skin diseases, including syphilis and leprosy. During the early years of his career, he taught dermatology at Northwestern University Medical School, where his research was instrumental in devising electropyrexia, a treatment for early-stage syphilis. Before he left Northwestern in 1941, he helped build the university's first medical laboratories.

After leaving Northwestern, Lawless entered the business world, first as president of 4213 South Michigan Corporation, which sold low-cost real estate, and later as president of the Service Federal Savings and Loan Association. By the 1960s, he was regarded as one of the 35 richest African American men in the United States.

During his lifetime, Lawless served on dozens of boards of directors and belonged to countless organizations. He served on the Chicago Board of Health, as senior attending physician at Provident Hospital, as associate examiner in dermatology for the National Board of Medical Examiners, as chairman of the Division of Higher Education, and as consultant to the Geneva Community Hospital in Switzerland. He was also recognized with many awards for his breakthroughs in medicine, public service, and philanthropy, including the Harmon Award in Medicine in 1929, Churchman of the Year in 1952, the Spingarn Medal from the NAACP in 1954, and the Daniel H. Burnham Award from Roosevelt University in 1963. He died in 1971.

ROBERT H. LAWRENCE JR. (1935–1967) Astronaut, Airplane Pilot

Air Force Major Robert H. Lawrence Jr. was the first African American astronaut to be appointed to the Manned Orbiting Laboratory. A native of Chicago, Lawrence became a model airplane hobbyist and a chess enthusiast while still in elementary school. At Englewood High School in Chicago, he became interested in biology, excelled in chemistry and track, and graduated in the top 10 percent of his class.

Lawrence entered Bradley University, joining the Air Force Reserve Officers' Training Corps and attaining the rank of cadet lieutenant colonel, which made him the second-highest-ranking cadet at Bradley. Lawrence was commissioned a second lieutenant in the United States Air Force in 1956 and soon after received his bachelor's degree in chemistry. Following a stint at an air base in Germany, Lawrence entered Ohio State University through the Air Force Institute of Technology as a doctoral candidate, earning his Ph.D. in 1965. Lawrence's career ended in 1967, when he was killed in the crash of an F-104D Starfighter on a runway in the California desert.

ELIJAH MCCOY (1843–1929) Inventor, Machinist

Born in 1843 in Canada, McCoy traveled to Scotland at age 16. There he was apprenticed to a master mechanic and engineer. After the Civil War, he moved to Ypsilanti, Michigan, where he sought work as an engineer. However, he was only able to obtain employment as a fireman and oiler for Michigan Central Railroad.

McCoy's first invention was a lubricating cup that used steam pressure to drive oil into channels that brought it to a steam engine's moving parts. It was patented in 1872. Before this invention, these parts had to be oiled during a train's intermittent stops, slowing the pace of rail travel considerably. In addition, the automatic device kept the engine better lubricated than was possible with the old method. Variations of this cup came to be used on many types of heavy machinery. Although McCoy received at least 72 patents in his lifetime, little money reached his pockets as a result of his ideas. Because he lacked the capital to invest in manufacturing, he sold most of his patents for modest sums while the manufacturers made millions. Later in his life, he helped found the Elijah McCoy Manufacturing Company, but he died just a few years later.

RONALD E. MCNAIR (1950–1986) Astronaut

Ronald McNair was born on October 21, 1950, in Lake City, South Carolina. He graduated from North Carolina A&T State University with a B.S. degree in physics and received a Ph.D. in physics from the Massachusetts Institute of Technology. In 1978, he was awarded an honorary doctorate of laws by North Carolina A&T.

McNair was working on the use of lasers in satellite communications when he was selected by NASA in 1978 to train as an astronaut. In August 1979, he completed a one-year training and evaluation period that made him eligible for assignment as a mission specialist on Space Shuttle flight crews. He presented papers on lasers and molecular spectroscopy at meetings in the United States and Europe, and he was the second African American to orbit the earth on a NASA mission.

Despite the rigors of NASA training, he taught karate at a church, played the saxophone, and found time to interact with young people. McNair was aboard the shuttle Challenger when it exploded shortly after liftoff from the Kennedy Space Center and plunged into the waters off the Florida coast on January 28, 1986. The shuttle carried a crew of seven, including two women and a teacher-in-space participant.

WALTER E. MASSEY (1938– ) Physicist

Walter Eugene Massey was born in Hattiesburg, Mississippi, on April 5, 1938. At the end of the tenth grade, he accepted a scholarship to Morehouse College. He almost quit after a few weeks, but graduated four years later with a B.S. in physics. He completed his Ph.D. in physics in 1966.

Massey’s research interests have included solid state theory (study of properties of solid material) and theories of quantum liquids and solids. While still a graduate student, he studied the behavior of both solid and liquid helium-3 and helium-4, publishing a series of papers on this work in the early 1970s. He became a full professor at Brown University in 1975 and was named dean of the college in the same year. Massey’s best-known accomplishment at Brown was his development of the Inner City Teachers of Science (ICTOS) program, a program for the improvement of science instruction in inner city schools. He was awarded the American Association for the Advancement of Science’s Distinguished Service Citation for his development of ICTOS.

In 1979, the University of Chicago invited Massey to become professor of physics and director of the Argonne National Laboratory, which the university operates for the U.S. Department of Energy. The facility was beset by financial troubles at the time, and Massey has been credited with its successful recovery. In the fall of 1990, Massey was chosen by Pres. George H. W. Bush to head the National Science Foundation (NSF), a position he held until 1993. He was only the second African American to hold that post. Massey was president of Morehouse College from 1995 to 2007. He has also served on the board of several major corporations.

JAN MATZELIGER (1852–1889) Inventor, Shoemaker/Leather Worker

Born in 1852 in Paramaribo, Dutch Guiana, Matzeliger found employment in the government machine works at the age of 10. Nine years later, he left home and eventually immigrated to the United States, settling in Philadelphia, where he worked in a shoe factory. He later moved to New England, settling permanently in Lynn, Massachusetts, in 1877. The Industrial Revolution had by this time resulted in the invention of machines to cut, sew, and tack shoes, but none had been perfected to last a shoe, which involved stretching the leather over a model foot. Observing this, Matzeliger designed and patented a device, one which he refined over the years to a point where it could last the leather, arrange the leather over the sole, drive in the nails, and deliver the finished product—all within one minute.

Matzeliger’s patent was subsequently bought by Sydney W. Winslow, who established the United Shoe Machine Company. The continued success of this business brought about a 50% reduction in the price of shoes across the nation, doubled wages for unskilled workers, and improved working conditions for millions of people dependent on the shoe industry for their livelihood. Between 1883 and 1891, Matzeliger received five patents on his inventions, all which contributed to the shoemaking revolution. His last patent was issued in September 1891, two years after his death.

Matzeliger died of tuberculosis in 1889 at the age of 37, long before he had the chance to realize a share of the enormous profit derived from his invention. He never received any money. Instead, he was issued stock in the company that did not become valuable until after his demise.

GARRETT A. MORGAN (1877–1963) Inventor

Born in Paris, Kentucky, in 1877, Morgan moved to Cleveland at an early age. Although he was most famous for his invention of the gas inhalator, an early gas mask, he also invented an improvement on the sewing machine that he sold for $150, as well as a hair-refining cream that straightened human hair. The cream remained in use for over 40 years. In 1923, having established his reputation with the gas inhalator, he was able to command a price of $40,000 from the General Electric Company for his automatic traffic signal.

In 1912, Morgan developed his safety hood, a gas inhalator that was a precursor to the gas mask. The value of his invention was first acknowledged during a successful rescue operation of several men trapped by a tunnel explosion in the Cleveland Waterworks, some 200 feet below the surface of Lake Erie. During the emergency, Morgan, his brother, and two other volunteers were the only men able to descend into the smoky, gas-filled tunnel and save several workers from asphyxiation.

Orders for the Morgan inhalator soon began to pour into Cleveland from fire companies all over the nation, but as soon as Morgan’s racial identity became known, many of them were canceled. In the South, it was necessary for Morgan to utilize the services of a white man to demonstrate his invention. During World War I, the Morgan inhalator was transformed into a gas mask used by combat troops. Morgan died in 1963 in Cleveland—the city that had awarded him a gold medal for his devotion to public safety.

WAVERLY J. PERSON (1927– ) Geophysicist, Seismologist

Waverly J. Person, born in Blackenridge, Virginia, on May 1, 1927, is the first African American to hold the prominent position of director of the United States Geological Survey's National Earthquake Information Center. A respected geophysicist and seismologist, he was also one of the first African Americans in his field, and he encourages minority students to consider careers in the earth sciences.

While working as a technician at the National Earthquake Information Center, Person took up graduate studies. From 1962 to 1973, he held that position while completing graduate work at American University and George Washington University. His supervisors assigned him increasingly challenging tasks, which he performed well, gaining notice among his peers. Soon, he qualified as a geophysicist and transferred to the United States Geological Survey's National Earthquake Information Center in Colorado. In 1977, Person was named director of the Colorado National Earthquake Information Center, and in 1994, he was named director of the National Earthquake Information Center.

Person has been honored with many distinguished awards throughout his professional life. These include an honorary doctorate in science from St. Paul's College in 1988; recognition as an Outstanding Government Communicator in 1988; the Meritorious Service Award from the United States Department of the Interior in 1989; and, in 1990, the Annual Minority Award from the Community Services Department in Boulder, Colorado. His work at the National Earthquake Information Center has been praised by the U.S. Department of the Interior, and he is often called on as an earthquake spokesperson by national and international media.

NORBERT RILLIEUX (1806–1894) Inventor, Mechanical Engineer

Norbert Rillieux's inventions were of great value to the sugar refining industry. The method formerly used called for gangs of slaves to ladle boiling sugarcane juice from one kettle to another—a primitive process known as the "Jamaica Train." In 1845, Rillieux invented a vacuum evaporating pan (a series of condensing coils in vacuum chambers) that reduced the industry's dependence on gang labor and helped manufacture a superior product at a greatly reduced cost. The first Rillieux evaporator was installed at Myrtle Grove Plantation, Louisiana, in 1845. In the following years, factories in Louisiana, Cuba, and Mexico converted to the Rillieux system.

A native of New Orleans, Rillieux was the son of Vincent Rillieux, a wealthy engineer, and Constance Vivant, a slave on his plantation. Young Rillieux’s higher education was obtained in Paris, where his extraordinary aptitude for engineering led to his appointment at the age of 24 as an instructor of applied mechanics at L’Ecole Centrale. Rillieux returned to Paris permanently in 1854, securing a scholarship and working on the deciphering of hieroglyphics.

When his evaporator process was finally adopted in Europe, he returned to inventing with renewed interest—applying his process to the sugar beet. In so doing, he cut production and refining costs in half.

Rillieux died in Paris on October 8, 1894, leaving behind a system that is in universal use throughout the sugar industry, as well as in the manufacture of soap, gelatin, glue, and many other products.

MABEL K. STAUPERS (1890–1989) Former National Association of Colored Graduate Nurses President, Nurse, Civil Rights Advocate

As president of the National Association of Colored Graduate Nurses, Mabel Keaton Staupers led a successful drive to integrate the mainstream nursing profession and to end segregation in the U.S. Armed Forces Nurse Corps in World War II.

Staupers was born in Barbados, West Indies, on February 27, 1890, and migrated to the United States in 1903, settling in Harlem. She began her nursing education in 1914 at Freedmen’s Hospital School of Nursing (now known as Howard University College of Nursing) in Washington, D.C., and in 1917 graduated with class honors. After graduation, she began private-duty nursing until, with the assistance of physicians Louis T. Wright and James Wilson, she helped to found the Booker T. Washington Sanitarium in Harlem in 1922. This served as Harlem’s first inpatient center for African American patients with tuberculosis.

With a working fellowship, Staupers spent time at Jefferson Hospital Medical College in Philadelphia, then conducted a survey of health needs in Harlem for the New York Tuberculosis Association. She identified the health care problems of minorities, leading to the establishment of the Harlem Committee of the New York Tuberculosis and Health Association. Ultimately, she served 12 years as executive secretary.

During the 1930s and 1940s, Staupers worked closely with association president Estelle Massey Riddle in a fight to integrate African American nurses into the mainstream of nursing. On January 20, 1945, the surgeon general of the U.S. Army announced that race would no longer be a factor in accepting nurses into the Army Nurse Corps. The U.S. Navy followed five days later by integrating the Navy Nurse Corps.

Later, Staupers fought to end the racial barriers of the American Nurses’ Association (ANA); in 1948, its House of Delegates opened the organization to African American members. By 1949, however, Staupers persuaded the NACGN’s members that the organization had realized its goals and was now obsolete. The organization’s convention in that year voted to dissolve the organization, and Staupers presided over the formal dissolution.

In recognition of her leadership and efforts to remove racial barriers for African American women in the military and the ANA, Staupers was widely honored. Among her honors was the Spingarn Medal, which she received from the National Association for the Advancement of Colored People in 1951. She recorded the plight of African American nurses in her book, No Time for Prejudice: A Story of the Integration of Negroes in the United States. Staupers died on November 29, 1989.

LEWIS TEMPLE (1800–1854) Inventor

The toggle iron harpoon—the standard harpoon used in American whaling from the mid-nineteenth through the early twentieth centuries—was invented by Lewis Temple. This harpoon, which had a movable head that prevented a whale from slipping loose, was an improvement over the barbed-head harpoon and led to a doubling of the annual catch.

Little is known of Temple's early background, except that he was born a slave in Richmond, Virginia, in 1800 and had no formal education. He obtained his freedom and as a young man moved to New Bedford, Massachusetts, then a major New England whaling port. Finding work as a metalsmith, Temple modified the design of the whaler's harpoon and, in 1848, manufactured a new version with a barbed, pivoting head that made it much harder for a harpooned whale to escape. Using the toggle harpoon, the whaling industry soon entered a period of unprecedented prosperity. Temple, who never patented his harpoon, was injured in an accidental fall and never completely recovered. He died in May 1854, destitute.

VIVIEN THOMAS (1910–1985) Surgical Research Technician

Born in Nashville, Tennessee, in 1910, Vivien Thomas had dreamed of a career as a physician since childhood. As a teenager, he worked as a carpenter and as an orderly to earn money for college, and enrolled in Tennessee Agricultural and Industrial College in 1929. The stock market crash later that year eradicated Thomas’ savings, and he was forced to quit school.

The following year, he was hired for a research assistant post at Vanderbilt University Medical School; he would become trauma researcher and surgeon Alfred Blalock’s assistant. For the next decade, Thomas worked long hours in the lab, conducting medical experiments for Blalock that eventually led to lifesaving advances in medicine during World War II, especially in the use of blood transfusions.

When Blalock was hired by the prestigious medical school at Johns Hopkins University in 1940, he would accept the post only if they hired Thomas as well. One of their most significant achievements together was a surgical procedure that restructured the blood vessels around an infant’s heart if the child was in danger of death due to poor circulation of blood into the lungs.

Thomas became a well-known and well-regarded figure on the campus of Johns Hopkins. He remained at the institution even after his mentor died in 1964, and in 1971 he was honored by graduates of its medical school for his achievements. He received an honorary degree in 1976, became a medical school faculty member in 1977, and retired in 1979. Thomas died in 1985, the same year a recounting of his life, Pioneering Research in Surgical Shock and Cardiovascular Surgery: Vivien Thomas and His Work with Alfred Blalock, was published.

MARGARET E. M. TOLBERT (1943– ) Analytical Chemist

Margaret Ellen Mayo Tolbert was born on November 24, 1943, to J. Clifton Mayo and Martha Artis Mayo in Suffolk, Virginia. The third of six children, Tolbert was still young when her parents separated and, shortly thereafter, her mother died. The six children were cared for by various neighbors and friends until they moved in with their paternal grandmother, Fannie Mae Johnson Mayo. Margaret Tolbert earned her undergraduate degree from Tuskegee University, and then obtained an M.S. in analytical chemistry in one year of study at Wayne State University. In 1970, she was recruited to join the doctoral program in chemistry at Brown University. Her research on biochemical reactions in liver cells was partially funded by a scholarship from the Southern Fellowship Fund.

In 1979, Tolbert spent five months in Brussels, Belgium, studying how different drugs are metabolized in rat liver cells at the International Institute of Cellular and Molecular Pathology. After her return to the United States, she was appointed director of the Carver Research Foundation. During her tenure, Tolbert was able to bring several large scientific research contracts to the university from the federal government—contracts that expanded the research capabilities of the entire school. From 1990 to 1993, Tolbert directed the Research Improvement in Minority Institutions Program for the National Science Foundation, which works to strengthen the infrastructure of research programs at minority colleges and universities.

In 1996, Tolbert was appointed director of the New Brunswick Laboratory at Argonne National Laboratory, a position she held until 2002. Only the third director in the laboratory's nearly 50-year history, she worked to enhance nuclear security nationally and to support international nonproliferation efforts. Tolbert was elected a fellow of the American Association for the Advancement of Science in 1998. She is a member of Sigma Xi, the American Chemical Society, the Organization of Black Scientists, and the American Association of University Women, and she has received numerous awards and honors, including the Women of Color in Government and Defense Technology Award in Managerial Leadership (2001).

Tolbert organized the U.S. Department of Energy's Science Education Directors Council in 1994 and served as the council's chair in 1995–1996. Involved in educational outreach throughout her career, Tolbert accepted a position in 2002 as a senior-level spokesperson at the National Science Foundation, where she promotes the foundation's efforts to increase the participation of women, underrepresented minorities, and persons with disabilities in science.

OMAR WASOW (1970– ) Computer Programmer, Entrepreneur

Born in Brooklyn, New York, in 1970, Omar Wasow received his B.A. in Race and Ethnic Relations from Stanford University. Wasow is an Internet analyst for NewsChannel 4 in New York. His reports appear on the station's various newscasts, and he is a frequent contributor to the station's Web site. He also serves as the Internet analyst for MSNBC and National Public Radio. In addition, Wasow is the executive director of BlackPlanet.com at Community Connect Inc., a Web site that facilitates online community among African Americans.

Wasow has become a leading commentator on the challenges and opportunities of new media and the new economy. The founder of New York Online, he was called a "pioneer in Silicon Alley" by the New York Times, named one of the "50 most influential people to watch in cyberspace" by Newsweek, listed among the "Silicon Alley Top 10" by the Village Voice, and named "one of 50 to watch in 1999" by A-List magazine.

In 1993, Wasow predicted a shift in online demographics from hackers and academics to mainstream users, and rapidly produced the widely admired local online community New York Online. As his reputation grew, corporate clients like VIBE, Essence, Consumer Reports, Latina Magazine, The New Yorker, United Artists, and Samsung retained his company’s expertise to assist them in launching successful Internet ventures of their own.

Wasow is also a member of the Board of Contributors of USA Today and writes an Internet business column for FeedMag.com. Active in a number of social issues, particularly school reform, Wasow is co-chair of The Coalition for Independent Public Charter Schools. In that capacity, he helped push New York State to pass its breakthrough charter school legislation. In 2003, the Brooklyn Excelsior Charter School, a K-6 school he helped found, opened.

He is also a member of several nonprofit boards, including the New York Software Industry Association, WorldStudio, and The Refugee Project. As a result of Wasow’s longstanding commitment to civic participation, he was selected to be a fellow in the Rockefeller Foundation’s Next Generation Leadership program.

Wasow is pursuing a doctorate in African American studies and political science at Harvard.

LEVI WATKINS JR. (1945– ) Surgeon, Educator

Levi Watkins was born in Parsons, Kansas, in 1945, but his father, a college professor, moved the family to Alabama for a job with Alabama State University. Watkins grew up in Montgomery, Alabama, where through his involvement in local churches he became acquainted with civil rights leaders Dr. Ralph David Abernathy and the Rev. Martin Luther King Jr., both prominent members of the Montgomery community, as was Watkins' father. The teenager's participation in civil rights issues did not keep him from excelling academically: he graduated as valedictorian of his high school class and went on to earn an honors degree from Tennessee State University in 1966.

Watkins’ awareness of issues of racial inequality led him to apply to Vanderbilt University Medical School, and he first learned of his acceptance as its first African American student by reading the newspaper headline announcing the breakthrough. He graduated in 1970, and began his internship and surgical training at the prestigious medical school at Johns Hopkins University. Watkins also studied at Harvard University Medical School for a time, and conducted research that led to the lifesaving practice of prescribing angiotensin blockers for patients susceptible to heart failure.

In 1978, Watkins became Johns Hopkins’ first African American chief resident in cardiac surgery, and became a faculty member that year as well. Two years later, he made medical history with the first successful surgical implantation of an AID (automatic implantable defibrillator) device, which has been credited with saving countless lives by its ability to restore a normal heartbeat during an attack of arrhythmia. In 1991, he became a full professor of cardiac surgery at Johns Hopkins, another first for the institution, and was named dean for postdoctoral programs and faculty development. For several years, however, Watkins had been working to increase minority presence at this elite medical school, and he instituted a special minority recruiting drive when he was appointed to the medical school’s admissions committee in 1979.

Watkins has been the recipient of numerous honors and awards, and in 2000, he received national recognition from the Guidant Corporation for his pioneering work on the automatic defibrillator.

DANIEL H. WILLIAMS (1856–1931) Surgeon/Physician

A pioneer in open heart surgery, Daniel Hale Williams was born in Hollidaysburg, Pennsylvania, on January 18, 1856. In 1878, he apprenticed to a prominent physician, which gave him the training to enter the Chicago Medical College in 1883.

Williams opened his office on Chicago’s South Side at a time when Chicago hospitals did not allow African American doctors to use their facilities. In 1891, Williams founded Provident Hospital, which was open to patients of all races. At Provident Hospital on July 10, 1893, Williams performed the operation upon which his later fame rests. A patient was admitted to the emergency ward with a knife wound in the pericardium, or the membrane enclosing the heart. With the aid of six staff surgeons, Williams made an incision in the patient’s chest and successfully repaired the tear. The patient fully recovered and was soon able to leave the hospital.

In 1894, President Cleveland appointed Williams surgeon-in-chief of Freedmen’s Hospital in Washington, D.C. He completely reorganized and updated procedures at the hospital, adding specialty departments, organizing a system of horse-drawn ambulances, and initiating more sanitary medical practices. After some political infighting at Freedmen’s, he resigned his post in 1897 to return to Provident.

Williams was instrumental in forming the Medico-Chirurgical Society and the National Medical Association. In 1913, he was inducted into the American College of Surgeons at its first convention. Over the course of his career, Williams helped establish over 40 hospitals in 20 states to serve African American communities. He died on August 4, 1931, after a lifetime devoted to his two main interests—the NAACP and the construction of hospitals and training schools for African American doctors and nurses.

O. S. WILLIAMS (1921– ) Aeronautical Engineer

Oswald S. "Ozzie" Williams was born on September 2, 1921, in Washington, D.C., but he was raised in New York City, graduating from Boys High School in Brooklyn in 1938. He became interested in engineering as a teenager. He attended New York University, where in 1943 he became the second African American to receive a degree in aeronautical engineering; he earned his master's in the field in 1947.

In 1950, Williams took an engineering position at Greer Hydraulics, Inc. There he was responsible for the development of the first experimental airborne radio beacon, which was used to locate crashed airplanes. However, it was never produced commercially. At Grumman International, where he was hired as a propulsion engineer in 1961, Williams managed the development of the Apollo Lunar Module reaction control subsystem. He was fully responsible for the $42 million effort for eight years. He managed the three engineering groups that developed the small rocket motors that guided the lunar module, the part of the Apollo spacecraft that actually landed on the moon. Williams went on to a career in marketing at Grumman, culminating in his election as a company vice president in 1974.

After leaving Grumman, Williams taught marketing at St. John’s University in Queens, New York, where he completed an M.B.A. in 1981.

Williams was a member of the American Institute of Aeronautics and Astronautics and an associate fellow and chair of the institute’s Liquid Rockets Technical Committee.

GRANVILLE T. WOODS (1856–1910) Electrical Engineer, Inventor

Sometimes referred to as the "Black Edison" because of his important electrical inventions, Granville T. Woods was born in Columbus, Ohio, on April 23, 1856. He attended school in Columbus until he was 10 years old, when he was forced to leave and go to work. Woods began learning on the job as a machine shop apprentice and later became a machinist and blacksmith. As a young man, he attended night school and studied privately, for he understood the importance of education and training in achieving his goals.

In 1872, Woods worked as a fireman on the Danville and Southern railroad in Missouri and later became an engineer. In his free time, he studied electronics. In 1874, he moved to Springfield, Illinois, and worked in a mill in which iron and steel were rolled into plates and bars. He took a job aboard a British steamer in 1878 and was promoted to chief engineer of the steamer within two years.

Woods eventually settled in Cincinnati, Ohio. In 1887, he patented a railway telegraph system that allowed moving trains to communicate with the station and with other trains in the area. Thus, it was possible to pinpoint the locations of trains between stations and avoid accidents and collisions. Thomas Edison later sued Woods, claiming that he was the first inventor of this “multiplex telegraph” system. After Woods won the suit, Edison offered Woods a prominent position in the engineering department at the Edison Electric Light Company in New York, but Woods rejected his offer. Alexander Graham Bell’s company purchased the rights to Woods’ telegraph system, thus allowing Woods the means to pursue inventing full-time.

In 1888, Woods developed an overhead system of electrical railway lines; this invention led to the overhead railroad system found in major cities such as Chicago, St. Louis, and New York City.

Woods became interested in thermal power and steam-driven engines, and in 1889, he filed his first patent for an improved steam boiler furnace.

During his illustrious career, Woods patented over 35 electrical inventions, many of which were sold to major companies such as Bell Telephone, General Electric, and Westinghouse. His automatic air brake, used to slow or stop trains, reduced train accidents and was just one of the inventions that made the railways safer.

Woods spent the last years of his life in court battles, attempting to gain control over his own inventions. On January 30, 1910, he died in New York City, in near poverty.


Science and Technology

Ronald E. Doel and Zuoyue Wang

Science did not become a major concern of U.S. foreign policy until the twentieth century. This is not to say that science was unimportant to the young republic. U.S. leaders recognized that, in the Age of Reason, the prestige of science was part of the rivalry between nations. Yet through the nineteenth century science was primarily linked to foreign policy as an adjunct of trade relations or military exploration. By contrast, mechanical ability was central to the identity of Americans, and debates about the proper role of technology in American relations to Britain and Europe raged through the late nineteenth century, as the United States gained worldwide recognition for creating the modern technological nation.

Technology, and enthusiasm for technical solutions to social problems, remained important in American foreign relations through the twentieth century. But its position relative to science changed markedly after 1900. By the start of World War II, science became a new and urgent topic for policymakers, inspiring an uneasy relationship that profoundly challenged both diplomats and scientists. As the Cold War began, the U.S. government funded new institutions and programs that linked science with diplomatic efforts and national security aims. Some were cloaked in secrecy; others were incorporated into major foreign aid efforts such as the Marshall Plan. By the late twentieth century, policymakers viewed science and technology as synergistic twins, significant yet often unpredictable agents of economic, political, and social change on both national and global scales.

THE EARLY REPUBLIC

In the earliest years of the American Republic, the ideas of natural philosophy informed the worldview of the framers of the American Constitution. The most educated of them, including Thomas Jefferson, James Madison, and Benjamin Franklin, were familiar with the ordered clockwork universe that the greatest of Enlightenment scientists, Isaac Newton, had created, and metaphors and analogies drawn from the sciences permeated their political discourse. But the pursuit and practice of science was seen as part of a transnational "Republic of Letters," above the petty politics of nations. When a group of Harvard scientists sought to observe an eclipse in Maine's Penobscot Bay at the height of the revolutionary war in 1780, British forces not only tolerated them but provided safe passage. Similarly, while Franklin was a singularly well-known scientist, widely revered in France as the founder of the science of electricity, he served as the new nation's emissary to Paris on account of his similarly impressive skills in diplomacy and familiarity with French centers of power. While a number of institutions responsible for scientific research emerged within several decades after the nation's founding, including the Coast and Geodetic Survey and the Naval Observatory, none dealt directly with areas of national policy. Alexis de Tocqueville overlooked significant pockets of learning when he declared in Democracy in America (1835) that "hardly anyone in the United States devotes himself to the essentially theoretical and abstract portion of human knowledge," but he was astute in observing that the "purely practical part of science" (applied technology) was what stirred the American imagination.

Still, adroit statesmen recognized that the apolitical "republic of science" could be a helpful tool in aiding foreign policy ambitions, a value connected with scientific research that would grow dramatically in later years. Exploration and geographic knowledge were important elements in contests for empire, and the nascent United States did support several successful exploring expeditions prior to the mid-nineteenth century. When President Thomas Jefferson sought to send Meriwether Lewis and William Clark on an expedition to the Pacific northwest, but lacked funds to provide military escort, he asked whether the Spanish minister would object to travelers exploring the Missouri River with "no other view than the advancement of geography." But in his secret message to Congress in January 1803, Jefferson emphasized the value the Lewis and Clark expedition would have in aiding United States control over this vast territory. By insisting that Lewis and Clark make careful astronomical and meteorological observations, study natural history, and record Indian contacts, Jefferson underscored an important relationship between science and imperialism. A similar set of concerns motivated the U.S. Exploring Expedition (Wilkes Expedition), which between 1838 and 1842 visited Brazil, Tierra del Fuego, Chile, Australia, and the East Indies and skirted 1,500 miles of the Antarctic ice pack (providing the first sighting of the Antarctic continent). Pressure to fund the expedition had come from concerned commercial and military groups, including whalers, who saw the Pacific as important for American interests. They did not sail empty waters, for this U.S. expedition overlapped with the voyages of the Beagle, the Antarctic expedition of Sir James Clark Ross of England, and the southern survey by Dumont d'Urville of France, and thus owed as much to nationalistic as to scientific rivalries. Yet government-sponsored expeditions in this era remained infrequent.

By contrast, technological concerns were very much on the minds of American leaders. The industrial revolution was well underway in Britain at the time of the American Revolution. Stimulated by the depletion of forests by the early eighteenth century as wood was consumed for fuel, Britain had developed coal as an alternative energy source, accelerating technological development through the steam engine (the crucial invention of the first industrial revolution) and the construction of water- and steam-powered mills. By the time of the American Revolution, British industries were supplying the American colonies with manufactured goods, spun cloth, textiles, and iron implements employed in farming. The former colonists' victory created a dilemma for the newly independent states, as Britain sought to forbid the export of machines or even descriptions of them to maintain its trading advantage. While the war in fact only temporarily cut off the United States from the output of the burgeoning industrial mills in Birmingham, Manchester, and London, and resumed migration after the war allowed mechanics to transfer technical knowledge across the Atlantic, government leaders still faced the question of what kind of material society the United States would attempt to create.

Americans at the turn of the nineteenth century agreed on one matter: they did not wish the United States to acquire the "dark satanic mills" that had made Manchester and Birmingham grimy, filthy cities, with overflowing sewers, wretched working conditions, widespread disease, and choking smoke. But American leaders also realized that a rejection of mill technology raised fundamental questions about what standards of material comfort the United States would aspire to reach, and the means, domestic and foreign, it would need to adopt to achieve those ends. Since sources of power were needed to increase living standards, how and in what ways the former colonies would develop means of production or acquire finished products would help to shape the future economic, political, and social structure of the nation.

The question of whether to import the factory system to America or to encourage the growth of the United States as an agrarian nation emerged as the initial critical struggle over the role of technology in American foreign policy. It fanned intense political passions in the nascent nation, and helped shape its first political parties. Thomas Jefferson favored limiting the import of technological systems and manufactured goods. Jefferson wanted a republic primarily composed of small farmers, who as independent landowners would enhance "genuine and substantial virtue." The growth of large cities, he feared, would lead to a privileged, capitalistic aristocracy and a deprived proletariat. Jefferson's vision of an agrarian republic represented an ideal in early American political thought, popularized by such works as Hector St. John de Crèvecoeur's Letters from an American Farmer in 1782. While Jefferson was not averse to all forms of manufacturing and would later soften his opposition to it even more, he initially envisioned a republic in which American families produced needed textiles at home and traded America's natural resources and agricultural output to secure plows and other essential artifacts. His foreign policy thus sought autonomy at the cost of more limited energy production and a lower standard of living.

Opposition to Jefferson's vision came from Alexander Hamilton, the New York lawyer and protégé of President George Washington who served as the young nation's first secretary of the treasury. Hamilton favored a diversified capitalistic economy, backed by a strong central government and import tariffs designed to nurture fledgling American industries. In his influential Report on Manufactures in 1791, Hamilton argued that "The Employment of Machinery forms an item of great importance in the general mass of national industry." Fearing a lack of social order from over-reliance on an agricultural economy, Hamilton declared that the development of industry would encourage immigration, make better use of the diverse talents of individuals, promote more entrepreneurial activity, and create more robust markets for agricultural products. Hamilton's prescription for nationalism and his support for technology gained favor from Franklin, Washington, and John Adams, although the Jeffersonian Republicans' conviction that virtue followed the plow still held sway among many Americans.

By the 1830s and 1840s, Hamilton's ideas had gained the upper hand, and the federal government became a firm supporter of technological development as a promising means to promote national prosperity. Jefferson's embargo of 1807 and the War of 1812, which illuminated the vulnerability of relying on Britain for manufactured goods, helped spur this development, but another critical factor was American success in developing technologies that increased agricultural output, including the invention of the cotton gin and the mechanical harvester. The abundance of powerful rivers in New England allowed manufacturers to develop textile mills that relied on water power, initially allowing new manufacturing centers like Lowell, Massachusetts, to avoid the industrial grime of Manchester. No less important, the rapid advance of canals, river boat transportation, and especially railroads provided a model for the integration of hinterland regions with the centers of the nation, a means for ensuring economic development and the sale of manufactured goods and products to foreign markets. For many, like the influential legislator William Seward, technology was the key to securing American domination over the continent and advancing trade. After Seward helped reinterpret patent law to ensure that U.S. inventors would profit from their creations, patent numbers swelled. Patents granted rose from an average of 646 per year in the 1840s to 2,525 in the 1850s. Dreams of a global commercial empire were similarly behind American efforts to open Japan to U.S. trading after 1852, as Japan possessed the coal needed by steamships bound to ports in China. These arguments became an enduring component of American perceptions about its global role, finding expression in Alfred T. Mahan's influential late nineteenth-century work on the influence of sea power on history.

Events in the middle decades of the nineteenth century reinforced American acceptance of technology as central to national progress. U.S. manufacturing advantages became even more evident after the invention of the sewing machine and Charles Goodyear's patenting of a process to vulcanize rubber in 1844. The invention of the telegraph encouraged additional trade and opened new markets, and citizens heralded the completion of the first transcontinental telegraph cables in 1861 as a new chapter in establishing an American identity. Already ten years earlier, Americans had delighted at the positive reception British and European observers gave to U.S.-built technological artifacts exhibited at the Crystal Palace exhibition in London. The Civil War forcefully focused national attention on the production of guns and steel, but even before the war American citizens had become convinced of the value of embracing new technological systems. National desires to develop a transcontinental railroad were sufficient to overcome nativist American attitudes toward foreign labor and open the doors to the over 12,000 Chinese laborers who completed laying Central Pacific track to create the first transcontinental railroad. By the time the Centennial International Exhibit opened in Philadelphia in 1876, visitors flocking to Machinery Hall were already convinced, as Seward had argued in 1844, that technology aided nationalism, centralization, and dreams of imperialistic expansion.

THE SECOND INDUSTRIAL REVOLUTION AND THE PROGRESSIVE ERA

Three closely related factors (industrialism, nationalism, and imperialism) soon combined to reinforce American enthusiasm for technology as a key element of national policy. By the end of the nineteenth century, the first industrial revolution (begun in England and concerned with adding steam power to manufacturing) yielded to a larger, globally oriented second industrial revolution, linked to broader systems of technological production and to imperialistic practice. In contrast to the first industrial revolution, which was regional and primarily affected manufacturers and urban dwellers, the second industrial revolution introduced mass-produced goods into an increasingly technologically dependent and international market. The rise of mass-produced sewing machines, automobiles, electrical lighting systems, and communications marked a profound transformation of methods of production and economics, becoming a major contributor to national economies in America and its European competitors. Manufacturing in the United States steadily climbed while the percentage of Americans working in agriculture declined from 84 percent in 1800 to less than 40 percent in 1900.

The second industrial revolution caused three important changes in the way Americans thought about the world and the best ways they could achieve national goals. First, the process of rapid industrialism brought about a heightened standard of living for many Americans, creating for the first time a distinct middle class. By the turn of the twentieth century, the architects of the interlocked technological systems that had made the United States an economic powerhouse (from the steel magnate Andrew Carnegie to the oil baron John D. Rockefeller and the inventor and electrical systems creator Thomas Alva Edison) were increasingly represented in Washington, and their concerns helped shape foreign policy discussions. Second, and closely related, industrialization heightened an emerging sense of national identity and professionalization among citizens in the leading industrialized nations. The rise of nationalism was fueled not only by the technologies that these system builders created, but by other technologies and systems that rose with them, including low-cost mass-circulation newspapers, recordings of popular songs and national anthems, and public schools designed to instill in pupils the work ethic and social structure of the modern factory. The late nineteenth century was also the time that national and international scientific societies were created. American science was growing through the increasing numbers of young scientists who flocked to European universities to earn their Ph.D.s, carrying home a wealth of international contacts and commitments to higher standards. It was no coincidence that the rise of professional scientific communities paralleled the expanding middle class, as both groups found common support in the expansion of land-grant and private universities and in the industrial opportunities that awaited graduates of those universities. These new networks crystallized swiftly: they included the American Chemical Society (1876), the International Congress of Physiological Sciences (1889), the American Astronomical Society (1899), and the International Association of Academies (1899). The American Physical Society (1899) was founded two years before the federal government created the National Bureau of Standards, reflecting growing concerns from industrialists about creating international standards for manufacture.

Finally, the rise of advanced capitalist economies came to split the globe into "advanced" and "backward" regions, creating a distinct group of industrial nations linked to myriad colonial dependencies. Between 1880 and 1914 most of the Earth's surface was partitioned into territories ruled by the imperial powers, an arrangement precipitated by strategic, economic, and trade needs of these modern states, including the securing of raw materials such as rubber, timber, and petroleum. By the early 1900s, Africa was split entirely between Britain, France, Germany, Belgium, Portugal, and Spain, while Britain acquired significant parts of the East Asian subcontinent, including India. The demands of modern technological systems both promoted and reinforced these changes. The British navy launched the HMS Dreadnought in 1906, a super-battleship with greater speed and firing range than any other vessel, to help maintain its national edge and competitive standing among its trade routes and partners, while imperialistic relations were maintained by technological disparities in small-bore weapons. One was a rapid-fire machine gun invented by Sir Hiram Maxim, adapted by British and European armies after the late 1880s. Its role in the emerging arms race of the late nineteenth century was summed up in an oft-repeated line of doggerel: "Whatever happens we have got/The Maxim gun and they have not."

The American experience in imperialism was less extensive than that of the leading European industrial nations, but nonetheless marked a striking shift from its earlier foreign policy. Until the early 1890s American diplomatic policy favored keeping the nation out of entangling alliances, and the United States had no overseas possessions. But by 1894 the United States came to administer the islands of Hawaii, and after the Spanish-American War of 1898 gained possession of (and later annexed) the Philippines. The story of America's beginnings as an imperial power has often been told, but the significance of technology and technological systems as a central factor in this development is not well appreciated. It is perhaps easier to see in the U.S. acquisition of the Panama Canal Zone in 1903. President Theodore Roosevelt and other American leaders recognized how an American-controlled canal would enhance its trade and strategic standing within the Pacific; they also had little doubt that U.S. industrialists and systems builders could construct it. A widely published photograph from that time reveals Roosevelt seated behind the controls of a massive earthmover in the Canal Zone. This single technological artifact served as an apt metaphor for the far larger technological system that turn-of-the-century Americans took great pride in creating.

World War I, a global conflict sparked by the clashing nationalistic aims of leading imperialist nations, pulled scientists and engineers further into the realm of diplomacy. While scientists continued to insist on the apolitical character of science, publication of a highly nationalistic defense of the German invasion of Belgium by leading German scientists in 1914 had left that ideal in tatters. More important, perhaps, was how the war educated Americans about their country's emerging role as a premier technological nation, and the importance of maintaining adequate sources of petroleum. After 1918, U.S. firms gained Germany's treasured chemical patents as war reparations, expanding American domination of textiles and the petrochemical industries. Americans also found that the leaders of the Russian revolution of 1917, Vladimir Lenin and Leon Trotsky, coveted American machinery and the American system of production to build the Soviet republic. By 1929 the Ford Motor Company had signed agreements with Moscow to build thousands of Ford autos and trucks, and Soviet authorities sought to adapt the management principles of Frederick Winslow Taylor in a Russian version of Taylorism.

The widening intersection between science, technology, and foreign relations was not limited entirely to contests between the United States and other imperialist powers. In the Progressive Era, biologists began to urge diplomats to aid efforts to preserve threatened species whose migrations took them across international boundaries. While efforts to ameliorate overfishing in the boundary waters separating the United States and Canada and seal hunting in the Bering Sea in the early 1890s amounted to little, a strong campaign to aid songbird populations resulted in the Migratory Bird Act of 1918 between the United States and Great Britain (on behalf of Canada), one of the most important early instances of a bilateral science-based treaty negotiated by the federal government. The significance of this treaty was not just what it accomplished (even though it served as an exemplar for other environmental treaties between the United States and its neighbors, including the Colorado River water treaty signed with Mexico in 1944). It also underscored the growing appeal of conservation values among middle- and upper-class American citizens, who joined with scientists to create nature preserves in unspoiled wilderness areas outside the United States, particularly in Africa. In such places, "nature appreciation" emerged as a commodity for tourism, its value determined by declining opportunities to experience wilderness in the North American continent. Private investments of this kind became a potent area of U.S. influence in the world's less developed areas, and took place alongside more traditional interactions including trade relations and missionary work.

WORLD WAR II AND THE EARLY COLD WAR

Science and technology entered a new phase in American foreign relations at the end of the 1930s. Gathering war clouds in western Europe convinced scientists and military leaders that greater attention had to be paid to scientific and technological developments that might aid the United States and its allies. World War II and the ensuing Cold War marked a fundamental watershed in the role that science and scientists would play in American diplomatic efforts. By the late 1940s, new institutions for international science arose within an unprecedented variety of settings (including the Department of State and the Central Intelligence Agency). Secrecy concerns influenced the practice of science and international communications, and new career opportunities arose as science and technology became significant in U.S. foreign policy as never before.

The integration of science into U.S. foreign policy during World War II initially came from the urging of scientists. In August 1939, just months after the German chemist Otto Hahn and Austrian physicist Lise Meitner, working with others, discovered that heavy atomic nuclei could be split to release energy, three scientists including Albert Einstein urged President Franklin D. Roosevelt to fund a crash program to see if an atomic bomb could be constructed. The Manhattan Project that ultimately resulted became the largest research project in the United States to date, one that involved intense and active cooperation with scientists from Great Britain and Canada. Advanced research in the United States also benefited from the emigration of outstanding Jewish scientists from Germany and Italy after the rise of Adolf Hitler and Benito Mussolini. But the atomic bomb project was only one area of international scientific cooperation: in 1940 the eminent British scientific leader Sir Henry Tizard flew to Washington on a secret mission to persuade the U.S. government to cooperate in building a system of radar and radar countermeasures. The Tizard mission laid the groundwork for effective Allied cooperation in building a wide range of science-based technological systems, including radar, the proximity fuze, and the atomic bomb. Scientists who served within the U.S. Office of Scientific Research and Development, with access to greater manufacturing capacity than Britain, also put into production the new drug penicillin.

Concern with devising new wartime weapon systems was equaled by strenuous Allied efforts to discover what science-based weapon systems Germany and Japan had constructed. Through such bilateral efforts, World War II thus nurtured two critical developments that would shape science and technology in the postwar world: the imposition of secrecy systems to protect national security concerns, and the creation of scientific intelligence programs to discover foreign progress in science and technology (particularly but not limited to advances in weaponry). Like penicillin, scientific intelligence was largely a British invention: British scientific intelligence was more advanced than U.S. efforts at the start of the war, owing to Britain's need to buttress its island defenses. But by 1944 U.S. leaders joined Allied efforts to send scientific intelligence teams behind the front lines of advancing Allied troops in western Europe, known as the ALSOS intelligence mission. While the most famous and best-remembered goal of the ALSOS teams was to discover whether Germany had built its own atomic bomb, this was only part of a larger mission to determine German advances in biological and chemical weapons, aeronautical and guided-missile research, and related scientific and technological systems. Broad fields of science were now for the first time relevant to foreign policy concerns.

Allied scientific intelligence missions also served another function: to catalog and inventory German and Japanese research and technological facilities as assets in determining wartime reparations and postwar science policy in these defeated nations. Both Soviet and Allied occupational armies sent back scientific instruments and research results as war booty. In Germany, where the U.S. and Soviet armies converged in April 1945, U.S. science advisers sought to locate and capture German rocket experts who had built the V-2 guided missiles, including Wernher von Braun. Von Braun's team was soon brought to the United States under Operation Paperclip, an army program that processed hundreds of Axis researchers without standard immigration screening for evidence of Nazi war crimes. Operation Paperclip was the most visible symbol of a concerted campaign to secure astronomers, mathematicians, biologists, chemists, and other highly trained individuals to aid American research critical for national security. In Japan, U.S. scientists focused primarily on wartime Japanese advances in biological warfare. While members of the Japanese Scientific Intelligence Mission that accompanied General Douglas MacArthur's occupation forces were unable to stop the senseless destruction of Japanese research cyclotrons by U.S. soldiers, science advisers successfully insisted that applied science and technology were critical components of Japan's economic recovery.

Above all it was the use of atomic weapons against Japan in the closing days of World War II that brought science and technology into the realm of U.S. foreign policy as never before. The roughly 140,000 people who died immediately at Hiroshima and Nagasaki, combined with the awesome destructive power of a device that relied on the fundamental forces of nature, made the atomic bomb the enduring symbol of the marriage of science and the state. In subsequent decades the U.S. decision to employ atomic weapons has become one of the most fiercely debated events in American foreign policy. Even before the bomb decision was made, a number of American atomic scientists protested plans to use nuclear weapons against Japan since it, unlike Nazi Germany, lacked the capacity to construct atomic weapons of its own. How the decision to use the bomb was made has split historians. Some have argued that U.S. leaders sought to end the war before the Soviet Union could officially declare war on Japan and thus participate in its postwar government, but many have concluded that other motivations were at least as important, including fears that Japanese leaders might have fought far longer without a show of overwhelming force and domestic expectations that all available weapons be used to conclude the war. Others have pointed out that U.S. policymakers had long seemed especially attracted to the use of technology in their dealings with Asian countries.

The largest conflict over nuclear weapons in the immediate postwar period involved the American monopoly over them, and how the United States could best safeguard the postwar peace. Bernard Baruch, the financier and statesman, proposed that atomic power be placed under international control through the newly established United Nations. The Soviet Union vetoed the Baruch Plan, believing that the proposal was designed to prevent it from acquiring nuclear weapons. Meanwhile, conservatives promoted a congressional bill that placed atomic energy under military control. Liberal scientists opposed the bill and advocated civilian control instead. In 1946, with the support of President Harry S. Truman, a Senate committee under Brien McMahon drafted a new bill that eventually resulted in a civilian-led (but militarily responsive) Atomic Energy Commission (AEC), one of the first postwar agencies designed to address science in foreign policy.

As the Cold War began, debate over science and technology in American foreign policy split along familiar lines. The most well-known of these involved efforts to maintain the deeply eroded traditions of scientific internationalism. Atomic scientists who supported international control of atomic energy created new national organizations, including the Federation of American Scientists. Participating scientists, including Albert Einstein, argued that physicists could aid the development of world government that would avoid the political perils of atomic warfare. In July 1957 nuclear scientists convened the first Pugwash meeting, drawing nuclear scientists from Western and communist nations to discuss approaches to nuclear disarmament. But promoters of scientific internationalism were not solely interested in atomic issues. The liberal internationalist and Harvard astronomer Harlow Shapley backed prominent British scientists Julian Huxley and Joseph Needham in their efforts to highlight science within the United Nations Educational, Scientific, and Cultural Organization (UNESCO). Leaders of the Rockefeller Foundation launched major new science initiatives in Latin America, while the National Academy of Sciences urged policymakers not to restrict American access to the world community of science. While public support for these positions remained high during the early years of the Cold War, they faded after Soviet Premier Joseph Stalin resumed a well-publicized crackdown on "bourgeois" research in genetics in favor of Trofim Lysenko's promotion of Lamarckian inheritance. This repression convinced many Americans that objective Soviet science had succumbed to state control. By the McCarthy era unrepentant internationalists were targets of a growing conservative backlash. The chemist and Nobel Laureate Linus Pauling, who won a second Nobel Prize in 1962 for his campaign to end nuclear testing, was one of several outspoken American scientists whose passports were temporarily revoked in the 1950s.

At the same time, other scientists began working with government officials in Washington, sometimes clandestinely, to investigate ways that scientists could aid U.S. national security by addressing major issues in American foreign policy. These activities took many forms. One of the more visible steps came in 1949, when President Truman announced, as the fourth point of his inaugural speech, that the United States was willing to "embark on a bold new program for making the benefit of our scientific advances and industrial progress available for the improvement and growth of under-developed areas." After Congress approved the so-called Point Four program a year later, tens of millions of dollars supported bilateral projects in science education, public health, agriculture, and civil engineering, adding to mainstream Marshall Plan funds used to restore technological and scientific capacity in the war-ravaged nations of western Europe. At the same time, U.S. scientists and technical experts worked to thwart Soviet efforts to obtain advanced Western computers, electronic devices, and other technologies and resources critical to weapons development. These included efforts to limit export of weapons-grade uranium to the Soviet Union and to deny Soviet access to Scandinavian heavy water as well as prominent Swedish scientists in the event of a Soviet invasion.

For U.S. policymakers, a principal challenge was to secure reliable overt and covert information on the scientific and technological capacity of other nations, since such intelligence was necessary to match enemy advances in weaponry, particularly in biological, chemical, and radiological warfare. A major point of intersection between physicists and U.S. policymakers came in efforts to discern Soviet advances in atomic bomb work and in developing methods to detect and analyze Soviet atomic tests, a task that gained greater urgency after the Soviet Union exploded its first nuclear device in August 1949. Hindered by a paltry flow of overt information from communist countries, U.S. scientists sought alternative means to secure such data. In 1947 several scientists who had managed the wartime U.S. science effort, including Vannevar Bush, James Conant, and Lloyd V. Berkner, helped create a set of new institutions devoted to the role of international science in national security. The first was the Office of Scientific Intelligence within the newly formed Central Intelligence Agency. Three years later, scientists working with the Department of State created a scientific attaché program, patterned on the U.K. Science Mission. A 1950 Berkner report to Secretary of State Dean Acheson, justifying this effort, declared that the program would strengthen Western science while providing American scientists and businesses helpful information; a secret supplement optimistically spelled out ways that attachés could covertly secure needed intelligence. Yet by 1952, national security experts concluded that foreign science and technology intelligence-gathering from the CIA and the Department of State remained woefully inadequate. The United States then created the top-secret National Security Agency to foster signals intelligence, employing the clandestine code-breaking strategies that had aided Allied victory during World War II.

Scientists and policymakers both found the abrupt integration of science into U.S. foreign policy unnerving. Many American scientists recognized that post-1945 national security concerns required pragmatic compromise of the unfettered exchange of information that had long been the ideal of science. The close relations that developed between scientists and the government during World War II also helped certain scientists undertake clandestine research programs. But most American scientists resented increasingly tight security restrictions, demands for secrecy, loyalty oaths, and mandatory debriefings by federal agents following overseas professional trips. Scientists who accepted posts in the State Department felt the snubs of colleagues who regarded such service as less prestigious than lab-bench research. For their part, traditional foreign relations experts, trained in economics or history, were largely unfamiliar with the concepts or practices of science, disdained the capacity of scientists in war-ravaged western Europe and the Soviet Union to produce quality science, and perceived the inherent internationalism of scientists as suspicious if not unpatriotic. Such views were widespread within the national security bureaucracy. Federal Bureau of Investigation director J. Edgar Hoover, familiar with top-secret Venona intercepts of encrypted Soviet communications used to discover atomic spies in the United States, regarded the internationalism of scientists as a threat to democracy and the proper aims of U.S. foreign policy.

Despite these mutual tensions, American leaders in the 1950s nonetheless sought to use science to influence foreign policy debates. Officials used scientific intelligence to refute highly publicized (and still unresolved) Chinese claims that American forces in Korea had violated international accords by employing bacteriological weapons in the winter of 1952. Even greater use of science as an ideological weapon was made by President Dwight Eisenhower, who in a major speech to the United Nations General Assembly in December 1953 offered his "Atoms for Peace" proposal calling for the peaceful uses of atomic power. Regarded at the time as a Marshall Plan for atomic energy, Atoms for Peace promoted the development of nuclear cooperation, trade, and nonproliferation efforts in noncommunist nations; it also provided nuclear research reactors to countries in South America and Asia. Eisenhower's advisers felt certain that the Soviet Union could not match the Atoms for Peace offer, and hence would suffer a political setback as a result. They also believed it would reduce the threat of nuclear warfare, an anxiety shared by western European leaders after the United States explicitly made massive retaliation the cornerstone of its national security policy.

Historians have debated the significance and meaning of the Atoms for Peace proposal. On the one hand, some maintain that Eisenhower correctly perceived that the most effective means of halting nuclear proliferation would come from promoting and regulating nuclear power through the auspices of the United Nations, while ensuring that the western European democracies would gain direct access to what at the time seemed a safe and low-cost source of energy. The program helped the United States secure 90 percent of the reactor export market by the 1960s. On the other hand, critics charge that Atoms for Peace actually served to increase the danger of nuclear proliferation. Yet other historians regard Atoms for Peace as part of a grander strategy to mute criticism of the accelerated buildup of U.S. nuclear weapons stockpiles and their secret dispersal to locations around the world, including West Germany, Greenland, Iceland, South Korea, and Taiwan. It is also clear that Eisenhower sought to exploit the apolitical reputation of science to wage psychological warfare and to gather strategic intelligence. In the mid-1950s the Eisenhower administration approved funds for the International Geophysical Year (IGY) of 1957–1958, an enormous effort to study the terrestrial environment involving tens of thousands of scientists from sixty-seven nations (a plan conceived by, among others, science adviser Lloyd Berkner). In one sense, Eisenhower's support for the IGY was overdetermined: policymakers saw an advantage in limiting rival nations' territorial claims to Antarctica by making the frozen realm a "continent for science" under IGY auspices, and Eisenhower recognized that a planned "scientific" satellite launch would enhance international claims for overflight of other nations' airspace, a concern because of U.S. reliance on high-altitude U-2 aircraft flights to gain intelligence on the Soviet Union. It was a strategy that one of his predecessors, Thomas Jefferson, had also understood.

Despite their greater involvement in foreign policymaking, scientists largely remained outsiders from diplomatic circles. This was due to several factors. Throughout his first term, Eisenhower maintained his small staff of science advisers in the Office of Defense Mobilization, a marginal agency remote from the machinery of the White House. More importantly, the White House failed to defend scientists against charges from Senator Joseph McCarthy and the House Un-American Activities Committee that cast aspersions on the loyalty of atomic scientists, particularly after the Soviet atomic bomb test of 1949. With the declassification of the Venona intercepts, historians now understand that espionage in America did provide Soviet agents with details of the "Fat Man" plutonium implosion bomb used at Nagasaki, giving Soviet physicists perhaps a year's advantage in constructing their own initial atomic weapon. This level of spying was greater than many on the left then believed, but far less than what Republican critics of scientific internationalism charged. These highly publicized accusations, and the loyalty investigation of atomic bomb project leader J. Robert Oppenheimer, nevertheless aided ideological conservatives convinced that scientists represented a threat to national security and that international science needed to be controlled along with foreign cultural and intellectual exchange. After the conservative-leaning U.S. News and World Report in 1953 reported a claim that the State Department's science office was "a stink hole of out-and-out Communists," Secretary of State John Foster Dulles, ignoring the protests of scientists, allowed the science attaché program to wither away.

These clashes pointed to fundamental tensions in efforts to employ science in American foreign policy. Moderates in the executive branch sought to use scientific internationalism to embarrass Soviet bloc countries by advertising links between Western democracy and achievements in science and technology (a theme heavily promoted in the Brussels World Exposition of 1958). Many believed that scientists in communist nations were the most likely agents for democratization and thus potential allies. Opposing them were ideological conservatives determined to limit international science contacts to strengthen national security and to restore clarity to U.S. foreign policy. These tensions came to a head in the mid-1950s when State Department officials refused to pay U.S. dues to parent international scientific unions in part because unrecognized regimes, including Communist China, were also members. American dues were instead quietly paid by the Ford Foundation, whose directors understood that the CIA's scientific intelligence branch greatly benefited from informal information and insights passed on by traveling American scientists. While the CIA's clandestine support for scientific internationalism helped sustain U.S. participation in major international bodies at the nadir of the Cold War, this conflict would not be resolved before the Sputnik crisis intervened.

SPUTNIK, THE ANTICOLONIAL REVOLUTION, AND SCIENCE AS AN IDEOLOGICAL WEAPON

By the late 1950s a second fundamental shift occurred in the role of science and technology in U.S. foreign policy. The shift had several causes. One was the launch of Sputnik, which established the Soviets as a potent technological force in the eyes of observers throughout the world, including western Europe. Another was that the Soviet Union's space spectacular occurred in the midst of the independence movement among former colonies in Africa and Asia. This worried U.S. officials who believed that Soviet triumphs in applied science and technology would tempt these emerging nations to develop socialist governments and build alliances with the Eastern bloc. Yet another factor was the heightened role of science in new multilateral treaty negotiations, including the Antarctic Treaty and the Limited Nuclear Test Ban Treaty, which brought scientists and policymakers into ever tighter orbits. Finally, increasing concern from American citizens about an environment at risk from radioactive fallout, a view shared by leaders of western European governments, helped make a wide range of environmental concerns from declining fish populations to improving agricultural productivity and addressing air and water pollution a greater focus of American foreign policy. Together, these led to a considerable transformation of U.S. foreign policy, increasing the influence of the United Nations and nongovernmental organizations, and heightening diplomatic links between the northern and southern hemispheres. While efforts to coordinate U.S. science policy remained ineffective, and relations between scientists and policymakers were sometimes strained, this realignment would persist through the end of the twentieth century.

The launch of Sputnik was a major foreign relations setback for the United States, in no small part because of American faith in its technology and a widespread conviction in the West that scientific and technological development within a democracy would triumph over that within a totalitarian state. But on 4 October 1957, the 184-pound Sputnik I, emitting a pulsed electronic beep, became the Earth's first artificial satellite. The launch produced banner headlines around the world and convinced many allies that Eastern bloc science and technology was equal to that of the United States. Secret U.S. Information Agency polling in Britain and in western Europe indicated that a quarter of their populations believed the Soviet Union was ahead in science and technology. In response, the United States accelerated programs designed to symbolize the nation's scientific and material progress, above all the space program. For the next quarter century science and technology would take on a new role in foreign policy: as a surrogate for national prosperity and stability.

Elevating science and technology as symbols of national potency, and hence as tools of foreign policy, took several forms. One was by investing in highly visible technological projects. The space program developed by the National Aeronautics and Space Administration (NASA) was a prime example. Technology as a symbol of national prestige was embodied in the bold (and ultimately successful) proposal to land humans on the moon by 1969, which President John F. Kennedy announced in a speech to Congress in May 1961 after his most embarrassing foreign policy failure, the Bay of Pigs disaster. But this was only one expression of many. The Kennedy administration also stepped up international programs in such fields as agriculture, medicine, and oceanography. As with the Wilkes Expedition a century before, the motivations behind such efforts were mixed. New research programs in oceanography were intended to help increase fish harvests by less developed countries, and American oceanographic vessels could show the flag at distant points of call. But oceanography was also a particularly strategic field because of growing concerns with antisubmarine warfare and efforts by less developed countries, working through United Nations bureaus, to extend their sovereignty to two hundred nautical miles beyond their coasts. Knowing the sizes of Soviet fish harvests was also of strategic value. Undertakings such as the multinational Indian Ocean Expedition of 1964–1965, which American scientists helped plan, seamlessly embodied all of these aims.

Science constituencies both within and outside the federal government responded to the Soviet achievement in various ways. Worried air force officials, anxious to demonstrate U.S. technological competence in the months following the launch of Sputnik, proposed detonating a Hiroshima-sized bomb on the moon in 1959 that would be instantaneously visible to watchers from Earth. Cooler heads at the Department of State and the White House rejected this idea because of its militaristic connotations. The National Science Foundation advocated increasing the number of exchanges between U.S. and Soviet scientists, while White House staff members supported the AEC's Plowshare program to make peaceful uses of atomic bombs, among them creating new canals and harbors. Members of Congress echoed private science groups in arguing that the Sputnik crisis showed that the United States had fallen behind in training future scientists. The massive rise in federal spending for math and science education after 1958 was another direct consequence of this foreign relations crisis.

The Sputnik shock forced administration officials to recognize that existing mechanisms for coordinating science and technology within foreign policy were inadequate. In 1957, President Eisenhower announced the creation of the position of special assistant to the president for science and technology (commonly known as the presidential science adviser) and the President's Science Advisory Committee (PSAC) to provide the White House with advice on scientific and technical matters, domestic and foreign. While members of PSAC, which was always chaired by the science adviser, were initially drawn from the physical sciences, reflecting continued preoccupation with space, nuclear weapons, and guided-missile delivery systems, PSAC's mandate soon expanded to include a wide range of scientific disciplines. The State Department's Science Office and attaché program, nearly eviscerated before Sputnik, were revived and handed new responsibilities for coordinating bilateral and multilateral programs. Not all government officials saw the increased focus on science and technology as positive. A Latin American ambassador complained that the U.S. embassy in Rio de Janeiro "needs a science attaché the way a cigar-store Indian needs a brassiere." Despite such criticisms, Washington exported these conceptions into its regional security alliances, creating a new science directorate within the North Atlantic Treaty Organization (NATO). While Democrats worried that this plan would militarize western European science and limit contacts with Soviet colleagues, NATO's science directorate steered new research contracts to its closest allies.

Another response to the Sputnik crisis was a dramatic expansion of foreign aid programs to support science and technology. In 1961 President Kennedy announced the creation of the Agency for International Development (AID), with an explicit mandate to fund research, education, and technology-based programs around the world. Advocates of old-style scientific internationalism supported AID programs as a way to extend UN programs that nurtured emerging research centers and sustainable development in less developed countries. In certain respects they were not disappointed: AID science programs provided significantly greater support to Latin American countries in the 1960s and 1970s than their feeble counterparts in the early Cold War period. Grants funded desalination projects, teacher training, and scientific equipment; in cooperation with science attachés, officials also protested the mistreatment of academics in Argentina and Brazil in the 1960s. But as with the Marshall Plan, foreign aid programs in science and technology were adjuncts in the greater struggle to extend U.S. influence to Latin America, the Asian subcontinent, and sub-Saharan Africa, and to win the hearts and minds of leaders in less developed countries deciding between Western and Soviet models of economic development. In practice, however, it was often difficult to separate humanitarian motives from calculations of Realpolitik. U.S. support for costly rainmaking experiments in India's Bihar-Uttar Pradesh area in the mid-1960s was justified by noting that these programs aided American policy aims by mitigating Indian embarrassment at lagging behind Chinese efforts to create an atomic bomb. But this secret research, however fanciful, did attempt to mitigate a life-threatening drought.

The best-known science and technology foreign-assistance program from this period was the Green Revolution. Based on hybrid forms of rice and wheat that had been developed in the United States in the 1930s, the Green Revolution promised to allow poorer nations to avoid the Malthusian dilemma by increasing the efficiency of planted fields to satisfy the demands of growing populations. In India, where severe drought crippled crops between 1965 and 1967, the planting of high-yield grains nearly doubled wheat and rice yields by the late 1970s. Stimulated and financed by the Rockefeller and the Ford Foundations, the Green Revolution was one of the most well-known private foreign aid programs during the Cold War.

Historians have reached differing conclusions about the impact and effectiveness of U.S. scientific and technological aid programs to Latin America and to sub-Saharan Africa in the 1960s and 1970s. Some argue that American aid programs in science and technology represent long-nurtured humanitarian impulses similar to those that informed the Marshall Plan and in general no less successful. Few scholars doubt that the American scientists and policy officials who designed these programs genuinely believed their efforts would achieve positive social ends. However, other historians have pointed out that scientists who sought grandiose results such as weather modification and greatly enlarged fish catches were overconfident about their ability to master nature without harming natural processes, and recent assessments of the Green Revolution have made clear that production gains were less than earlier claimed. A more significant problem was that planners often failed to realize that technical systems developed in advanced capitalistic countries could not be transported wholesale into other regions without concurrent local innovations and adaptive technologies. American enthusiasm about exporting the fruits of U.S. technologies was often accompanied by hubris in assessing the environments of less developed countries.

Beginning in the 1960s, American policymakers also faced new demands to negotiate international agreements governing applications of science and technology. A convergence of factors brought this about. The economic costs of maintaining the U.S. nuclear arsenal, concerns about proliferation, and a desire to moderate the arms race led the Eisenhower administration to begin discussions with the Soviet Union about what became the 1963 Limited Nuclear Test Ban Treaty. The catastrophe narrowly avoided in the Cuban missile crisis of 1962 inspired President Kennedy and Premier Nikita Khrushchev to sign it. But another reason was the growing realization among scientists and policymakers that even the testing of nuclear, biological, and chemical weapons represented a genuine threat to the health of American citizens and populations worldwide, and that such tests could have unintended consequences for diplomatic relations and regional stability. From secret monitoring of manmade radioactivity levels in the 1950s, scientists understood that measurable amounts had already spread worldwide. Policymakers were also unnerved by the "Bravo" nuclear test at Bikini Atoll in March 1954, a fifteen-megaton blast more than a thousand times the size of the Hiroshima bomb. Radioactive ash from the test spread across a broader area of the Pacific than expected, contaminating the Japanese tuna ship Lucky Dragon and in turn causing a panic in the Japanese fishing market and outrage in Japan and elsewhere. National Security Council members worried that a disruption of Japan's primary food resource might destabilize the government and allow Soviet encroachment. Amplifying these worries was growing popular concern with an environment at risk, accentuated by anxiety concerning nuclear and chemical fallout and the contaminants issue exemplified by Rachel Carson's 1962 Silent Spring. International treaties served policymakers' ends by reassuring citizens of limitations on uses of science-based weapon systems that many Americans found unsafe and threatening.

To be sure, policymakers often found it difficult to steer science to aid foreign policy goals, in part because of the elite nature of science, in part because the goals of scientists were often tangential to those of the state. But part of the problem was that by the 1960s policymakers could no longer count on a compliant media to keep covert activities involving international scientific activities secret. In 1962, the New York Times reported a highly secret test of a U.S. atomic bomb exploded in outer space eight hundred miles from Hawaii, code-named Starfish. The resulting controversy intensified suspicions of citizen groups on the left that science had become an extension of state power and morally suspect. Though U.S. officials successfully concealed many related projects from view, demands for greater openness led the 1975 Church Committee to examine unauthorized medical experiments within the CIA, and subsequent revelations about U.S. efforts to employ radiological warfare and to steer hurricanes toward enemy lands raised ethical dilemmas for many citizens. Yet at times the government successfully mobilized public support behind using science as a moral weapon. In 1982 the U.S. government canceled its bilateral science agreements with the Soviet Union to protest its treatment of atomic physicist and dissident Andrei Sakharov and its persecution of Jewish scientists. But at least as often relations between policymakers and their scientific advisers fractured. President Richard Nixon abolished PSAC in 1973 for its opposition to his antiballistic missile, supersonic transport, and Vietnam policies. In 1983 President Ronald Reagan announced his decision to proceed with his "Star Wars" Strategic Defense Initiative after consulting a small circle of scientists, bypassing standard review circles in an attempt to use science for strategic advantage.

By the 1970s and 1980s, policymakers also found that the critical defining relations for international science were no longer exclusively East-West but also North-South, between the developed and developing nations. U.S. scientists and diplomats were slower to react to this change than to the upheavals of anticolonialism in the late 1950s, misperceiving the significance of the change. When the Pakistani physicist and Nobel Laureate Abdus Salam created the International Center for Theoretical Physics in Trieste, Italy, in 1964, a center devoted to researchers from less developed nations, leading U.S. scientists and policymakers criticized Salam's plan as simply duplicating existing Western research facilities. But Salam's institute (backed by the United Nations and private foundations) was soon followed by parallel efforts in other fields, whose leaders sought to set research agendas reflecting the peculiar needs of these developing lands. Although often wary of these new centers (which reflected the growing influence of the UN, UNESCO, and other multilateral agencies, such as the International Atomic Energy Agency, that were remote from American influence), U.S. officials sought to remain apprised of their activities.

Even if science sometimes seemed an uncertain asset in American foreign policy, U.S. policymakers continued to regard technology as a key indicator of the superiority of American capitalism, illuminating the nation's core values of productivity and resourcefulness. Most Americans still believed that technological solutions existed for a large range of social and political problems. Early in the Cold War, many Americans suggested that Soviet citizens would revolt if sent Sears catalogs showing a cornucopia of American products, and their faith in technological fixes persisted after the launch of Sputnik. Perhaps technology, as embodied in military power, could cut through cultural differences to get the American message across. American policymakers brought to their interactions with Asian countries a sense of technological superiority and a penchant for technological solutions to complex social and political problems. This was especially the case during the Vietnam War, when American scientists, engineers, military, and civilian leaders worked together to create and implement carpet bombing, defoliants, and electronic battlefields.

American policymakers also sought to capitalize on Asian countries' desire to catch up with the West in science and technology. This interest was not new: the U.S. government, when returning part of the Boxer indemnities to China in the early 1900s, had stipulated that the Chinese government had to use the returned funds for sending students to the United States to study science and technology-related subjects. As a result, the Boxer fellowships helped train several generations of Chinese scientists and engineers. In the 1970s and 1980s, American policymakers again hoped that American science and technology would play a role in the reopening and the normalization of U.S.–China relations. The Shanghai Communique signed by Henry Kissinger and Zhou Enlai during Richard Nixon's famous trip to China in 1972 highlighted science and technology, along with culture, sports, and journalism, as areas for people-to-people contacts and exchanges. Indeed, the ensuing exchange of students and scholars, including large numbers of scientists and engineers, shaped U.S.–China relations in many ways during this period. In this connection, the disproportionately large number of Chinese Americans who work in science and technology-related fields often played an important role in facilitating such exchanges and in mitigating U.S.–China tensions.

Faith in technological solutions to problems of U.S. foreign policy remained evident in the waning days of the Cold War, even as significant manufacturing sectors were shifted from the United States to lower-cost labor markets throughout the globe. This same faith was applied to relations with the Soviet Union. As historian Walter LaFeber has noted, Secretary of State George P. Shultz learned about the rapid advances of information technology and communications in the early 1980s, at the start of the Reagan presidency. He decided that communications technology could be used to make the Soviet Union face a potentially undermining choice: to yield control over information, at the cost of weakening the system, or to maintain communist controls, at the cost of dramatically weakening its science and technology (and hence its economy and military). Against the advice of intelligence and State Department officials who saw few inherent technological weaknesses to exploit within the Soviet system, and convinced that the information revolution would lead to decentralized rather than central controls, Shultz pressed to bring this hard choice to the fore of American policy toward the Soviet Union. While the decline and ultimate collapse of the Soviet Union resulted from a complex set of social, political, and technological factors, modern information technology had become an important tool in U.S. foreign policy.

THE END OF THE TWENTIETH CENTURY

The fall of the Berlin Wall in 1989, and the collapse of the Soviet Union two years later, accelerated two significant and already evident trends. The first was the decreased ability of the federal government to regulate the involvement of Americans in international science and technological ventures. This decline owed to further advances in communications technology, the continued globalization of manufacture and research, and an unprecedented expansion of nonprofit organizations involved in myriad aspects of foreign science policy. The second was greater international support for global treaties designed to limit technologies that threatened the natural environment.

Reduced state control over the conduct and practice of science and technology as aspects of foreign policy had several causes. One was the general relaxation of state restrictions that followed the end of the Cold War, including a reduced level of concern about the threat of nuclear annihilation (though, as the abortive spy trial of the Los Alamos physicist Wen Ho Lee in the late 1990s would attest, the federal government remained vigilant, or even overzealous, as critics charged, about prosecuting alleged violations involving nuclear secrets). By 1990, international scientific exchanges had become so commonplace that the Department of State, which thirty years before had scrutinized each case, gave up trying to count them. Yet another was the rising influence of the biological and environmental sciences, challenging the dominance of the physical sciences as the key determinant of foreign policy in the sciences and providing nongovernmental organizations greater influence on policy decisions. In 1995 some 110,000 biological and life scientists were employed by the federal government, double the number from twelve years before. Well-funded conservation groups such as the World Wildlife Fund continued to export wilderness values and sustainable development concerns around the globe, including concern for the Amazon rain forests, while more militant organizations, including Greenpeace, succeeded in stimulating public pressure to address problems with international whaling practices and the regulation of drilling platforms in international waters. No less influential were private foundations, notably the Bill and Melinda Gates Foundation, which announced a $100 million commitment to international AIDS research in 2001, an undertaking reminiscent of the early twentieth-century foreign health campaigns of the Rockefeller Foundation. But commercial concerns from powerful business interests also shaped State Department policies toward international science and technology, particularly as the growing commercial value of products derived from molecular biology and genetics inspired Eli Lilly, Hoffmann-La Roche, Genentech, and other large multinational firms to organize research and production facilities on a global scale.

Another factor that undermined the ability of the state to regulate international science and technological projects was the increasingly transnational character of fundamental scientific research. While the institutional structure of science remained largely national in character (since the state remained the dominant patron of scientific research), scientists found fewer barriers to participating in international collaborations than at any prior time in history. Transnational coauthorships in leading scientific nations reached 19 percent by the mid-1980s, and scientists found it easier to cross borders to conduct experiments at major foreign research facilities and to attend conferences in once off-limits cities such as Havana and Beijing. Financial exhaustion caused by the Cold War also inspired new transnational technological collaborations, including the U.S.–Russian space station, the Cassini Mission to Saturn, and the multinational Human Genome Project, the first big-science undertaking in the biological sciences. While Washington policymakers generally saw these developments as advantageous to U.S. interests, the reduction of centralized controls over technical systems occasionally disturbed security-conscious officials. During the administration of President William Jefferson Clinton, law enforcement agencies attempted to restrict the importation of foreign encryption programs, seeking to retain access to information transmitted via computers for criminal investigations and national security purposes, but technology firms successfully resisted this effort.

But the ending of the Cold War, which left the United States as the sole surviving superpower, also caused policymakers to scale back on efforts to convince other world leaders of the merits of capitalist-based science and technology. Despite calls for a new Marshall Plan to aid the democratic transformation of the former Soviet Union (which included providing ways to keep unemployed Russian nuclear technicians and bioweapons specialists from taking their skills to Iran, Libya, and other sponsors of international terrorism), the United States provided little support. Private efforts to provide such support did not succeed, despite a $100 million investment provided by the financier George Soros from 1992 to 1995. Soros argued (as American national security advisers had done throughout the Cold War) that Russian scientists were bulwarks of liberal democracy and antidotes to religious fundamentalism and mystical cults, but terminated his support when Western democracies failed to match his contributions. While citizens generally backed such measures, budget constraints did not permit policymakers to offer more than patchy responses to these problems.

The United States and other Western governments have proven more inclined to address the impact of scientific and technological developments on the global environment, seeing these threats as more immediate and more amenable to international negotiation. By the 1980s and 1990s, American leaders began playing active roles in negotiating treaties that sought to mitigate the effects of industrial and military byproducts in the environment, including efforts to maintain biodiversity, to reduce the destruction of ultraviolet-shielding stratospheric ozone, and to limit the emission of carbon dioxide and other greenhouse gases that contribute to global warming. In certain respects these treaties resembled the 1963 Limited Nuclear Test Ban Treaty, which limited the global spread of radioactive fallout. Like the much earlier Migratory Bird Treaty Act of 1918, these also sought to employ the best scientific knowledge available to address an evident problem, and they were controversial in their day. But these late twentieth-century treaties were profoundly different from their predecessors in several ways: they posed major economic and national security questions at the highest levels of government, they involved the full-time work of large numbers of scientists and policymakers, and they addressed issues intensely familiar to citizens (by 1989, 80 percent of Americans had heard of global warming). They were also multilateral treaties rather than bilateral, as most earlier international environmental treaties had been, thus reflecting the growing influence of the United Nations as a force in international science policy. In the mid-1990s the Clinton administration, aware that a majority of Americans backed these efforts (and believing, as historian Samuel P. Hays has argued, that they reflected deep-rooted American values about the environment), explicitly declared its support for environmental diplomacy. The Clinton administration also suggested that environmental degradation could lead to political and social stress, even major instability, and thus became the first to publicly argue that water rights disputes and overfishing were as significant in foreign policy as traditional issues of ideology, commerce, and immigration.

By the beginning of the twenty-first century, U.S. willingness to take part in the post-Cold War framework of international science-based treaties appeared to wane. During his first six months in office, President George W. Bush signaled his intention to take a more unilateral stance, refusing to support the Kyoto Protocol on global warming while backing away from the 1996 Comprehensive Nuclear Test Ban Treaty and a pact designed to enforce an international ban on biological weapons (which powerful U.S. biotech groups had opposed, fearing the loss of trade secrets). In the early summer of 2001 Secretary of Defense Donald H. Rumsfeld voiced willingness to "cast away" the 1972 antiballistic missile treaty, the bedrock of mutually assured destruction that had guided U.S. nuclear weapons policy throughout the Cold War era. These actions are a reminder that conservative concerns about constraints on American power and about the political unreliability of scientists have not faded. Yet these efforts ought not to be taken as a sign of a major reorientation of the role of science and technology within U.S. foreign policy. The growth of an international framework for science and technology was largely determined by events beyond the control of the American people, who remain part of an international science and technological community more extensive than many realize. Constituencies for this system, within the scientific community and within Congress and the federal bureaucracy, are large. As with environmental values within the United States, global approaches to environmental regulation have gained favor with a significant portion of the U.S. population, and will remain a driving force in setting U.S. foreign policy.

BIBLIOGRAPHY

Badash, Lawrence. Scientists and the Development of Nuclear Weapons: From Fission to the Limited Test Ban Treaty, 1939–1963. Atlantic Highlands, N.J., 1995.

Bamford, James. Body of Secrets: Anatomy of the Ultra-Secret National Security Agency: From the Cold War Through the Dawn of a New Century. New York, 2001.

Barth, Kai-Henrik. "Science and Politics in Early Nuclear Test Ban Negotiations." Physics Today 51 (1998): 34–39.

Cohen, I. Bernard. Science and the Founding Fathers: Science in the Political Thought of Thomas Jefferson, Benjamin Franklin, John Adams and James Madison. New York, 1995.

Cueto, Marcos, ed. Missionaries of Science: The Rockefeller Foundation and Latin America. Bloomington, Ind., 1994.

DeGreiff, Alexis. "A History of the International Centre for Theoretical Physics, 1960–1980. Ideology and Practice in a United Nations Institution for Scientific Co-operation for the Third World Development." Ph.D. dissertation. Imperial College London, 2001.

Divine, Robert A. The Sputnik Challenge. New York, 1993.

Doel, Ronald E., and Allan A. Needell. "Science, Scientists, and the CIA: Balancing International Ideals, National Needs, and Professional Opportunities." In Rhodri Jeffreys-Jones and Christopher Andrew, eds. Eternal Vigilance: Fifty Years of the CIA. London, 1997.

Dorsey, Kurk. The Dawn of Conservation Diplomacy: U.S.–Canadian Wildlife Protection Treaties in the Progressive Era. Seattle, 1998.

Dupree, A. Hunter. Science in the Federal Government: A History of Policies and Activities. Baltimore, 1986.

Graham, Loren R. What Have We Learned About Science and Technology from the Russian Experience? Stanford, Calif., 1998.

Hays, Samuel P. Beauty, Health, and Permanence: Environmental Politics in the United States, 1955–1985. Cambridge, Mass., 1987.

Hindle, Brooke, and Steven Lubar. Engines of Change: The American Industrial Revolution, 1790–1860. Washington, D.C., 1986.

Holloway, David. Stalin and the Bomb: The Soviet Union and Atomic Energy, 1939–1956. New Haven, Conn., 1994.

Hughes, Thomas P. American Genesis: A Century of Invention and Technological Enthusiasm, 1870–1970. New York, 1989.

Kevles, Daniel J. "'Into Hostile Political Camps': The Reorganization of International Science in World War I." Isis 62 (1970): 47–60.

LaFeber, Walter. "Technology and U.S. Foreign Relations." Diplomatic History 24 (2000): 1–19.

Manzione, Joseph. "'Amusing and Amazing and Practical and Military': The Legacy of Scientific Internationalism in American Foreign Policy, 1945–1963." Diplomatic History 24 (2000): 21–56.

Needell, Allan A. Science, the Cold War, and the American State: Lloyd V. Berkner and the Balance of Professional Ideals. London, 2000.

Rydell, Robert W., John E. Findling, and Kimberly D. Pelle. Fair America: World's Fairs in the United States. Washington, D.C., 2000.

Schröder-Gudehus, Brigitte. "Nationalism and Internationalism." In R. C. Olby et al., eds. Companion to the History of Modern Science. London, 1990.

Skolnikoff, Eugene B. The Elusive Transformation: Science, Technology, and the Evolution of International Politics. Princeton, N.J., 1993.

Wang, Jessica. American Science in an Age of Anxiety: Scientists, Anticommunism, and the Cold War. Chapel Hill, N.C., 1999.

Wang, Zuoyue. "U.S.–Chinese Scientific Exchange: A Case Study of State-Sponsored Scientific Internationalism During the Cold War and Beyond." Historical Studies in the Physical and Biological Sciences 30, no. 1 (1999): 249–277.

Weiner, Charles. "A New Site for the Seminar: The Refugees and American Physics in the Thirties." Perspectives in American History 2 (1968): 190–234.

Wright, Susan. "Evolution of Biological Warfare Policy." In Susan Wright, ed. Preventing a Biological Arms Race. Cambridge, Mass., 1990.

See also Environmental Diplomacy; Nuclear Strategy and Diplomacy; Outer Space; Philanthropy.

ON THE NEED TO SUSTAIN INTERNATIONAL SCIENCE AND TECHNOLOGY

"In the world of science America has come of age in the decade immediately preceding the second world war. Before this time, basic science was largely a European monopoly and Americans trained either in this country or abroad had large stores of accumulated ideas and facts on which to draw when building new industries or promoting new processes. The automobile, for example, was engineered from basic ideas many of which went back to Newton and the radio industry has developed from the late nineteenth century theories and experiments of Maxwell and Hertz. Unfortunately the technological advancements of the last war, extended as they were by every means possible, appear to have largely exhausted developments latent in the present store of basic knowledge. This means that, unless steps are taken, the technological development of really new industries will gradually become more difficult and that in time a general leveling off in progress will take place. The implication of this for America and particularly for American foreign policy could be quite serious for, if such a plateau is reached, other countries, such as Russia, could presumably catch up with or even surpass us in production and hence in military potential. The consequences of such an altered balance are not difficult to foresee. Competent American scientists have recognized this dilemma for some time and have consequently come to believe that efforts must be made to stimulate basic science throughout the world in order that subsequent development either in America or elsewhere will have something on which to feed."

R. Gordon Arneson, U.S. Department of State, Secret Memorandum, 2 February 1950 (declassified 22 July 1998)

ON SCIENTIFIC INTELLIGENCE

"Historically, the major responsibility for intelligence in the United States, both during war and peace, has rested upon the military agencies of the government. Since World War II, intelligence has assumed a far greater peacetime role than heretofore and has had an increasing influence upon foreign policy decisions.

"In the overall utilization of intelligence in the policy making areas of the Department [of State], there appears to be too little recognition of the enormous present and future importance of scientific intelligence. In the past, military and political factors, and more recently economic considerations, have been the controlling elements in estimating the capabilities and intentions of foreign powers. Now, however, an increasingly important consideration in any such assessment is the scientific progress of the country concerned. For example, the determining factor in a decision by the U.S.S.R. either to make war or to resort to international political blackmail may well be the state of its scientific and technological development in weapons of mass destruction. It is therefore imperative that, in the Department, the scientific potential and technical achievements of the Soviet Union and their implications be integrated with the other elements of a balanced intelligence estimate for foreign policy determination."

Lloyd Berkner, Report of the International Science Policy Survey Group (Secret), 18 April 1950 (declassified 22 July 1998)

Science and Technology


SCIENCE AND TECHNOLOGY

In many ways—political, sociological, economic, and technological—the end of the nineteenth century marked a decisive turning point in American history. Irrevocably shaping the understanding of "modern times," the decades after 1870 were crucial for the shift in American society from a weak agrarian republic in 1800 to a powerful industrial nation at the dawn of the twentieth century.

SCIENCE, INDUSTRIALIZATION, AND ACCELERATION IN THE LATE NINETEENTH CENTURY

The population of the United States exploded from roughly five million in 1800 to more than seventy-seven million in 1900. At the turn of the nineteenth century only 322,000, or a mere 6 percent, of all Americans lived in cities. A hundred years later, the proportion was 40 percent, or more than thirty million people. Simultaneously, new technological devices rapidly and dramatically altered the lifestyle of almost every American. Earlier inventions such as the power loom, the sewing machine, the steam-driven flatbed press, the steamboat, the locomotive, the telegraph, the camera, and the spectroscope had strongly influenced the attitudes and mind-sets of nineteenth-century Americans. The large-scale application of these devices to industrial processes and further insight into their underlying scientific principles led to a number of complementary innovations during the final decades of the century (the telephone, the phonograph, moving pictures, the steam turbine, the internal combustion engine, the airplane, and so on) and transformed the physical and psychological conditions of life in America. The second half of the century, then, saw the implementation of the groundbreaking, major discoveries made by the end of the 1850s.

The later phase systematized and applied the sciences to industry and technology on a large scale. By the end of the century, technology became the primary model for progress. Scientific invention based on systematic research and experimentation replaced the older, republican tradition of ingenious tinkering, which depended on tedious trial-and-error procedures or more often, sheer luck. In keeping with this new orientation, and despite its obvious elitist underpinnings, progressive Americans increasingly embraced a culture of technological expertise that valued highly specialized knowledge and favored scientific solutions for practical, everyday problems. Every area of life now demanded "expert" wisdom, from the conservation of nature, agricultural development, mining and steel production, or factory administration and office organization to railroad and street building, city planning, welfare programs, the domestic economy, and even leisure and sports. If, as Carolyn Marvin points out, the retooling of American industries from steam to electricity fostered "a new class of managers of machines and techniques" (p. 9), the idea behind such specialization, namely, that in a complex social system science holds a remedy for every individual and societal problem, became a national creed.

This crucial period in American history was also about speed and acceleration, which some saw as the most distinctive characteristics of American society, and about how they became a determining factor in scientific, industrial, and cultural production. The invention of faster means of transportation such as the train, the automobile and, somewhat later, the airplane, decisively altered the way people perceived the world. Physically, these modes shaped and transformed the landscape by requiring an extensive grid of tracks and roads; psychologically, they changed human perception by destabilizing the relation between the fast-moving passenger and the world outside. The impact of these inventions clearly went beyond their physical existence to challenge perceptions of both self and the world. In 1925 the philosopher of science Alfred North Whitehead reflected that "in the past human life was lived in the bullock cart; in the future it will be lived in an aeroplane; and the change of speed amounts to a difference in quality" (p. 137). Yet to many of his contemporaries, firsthand experience with rapid movement had already become commonplace. By the early 1920s they were riding on turbine-driven high-speed trains, eating fast food in streamlined diners, or rushing to work in the latest Ford or GM models. By that time also, Albert Einstein (1879–1955) had articulated his special theory of relativity (1905), predicated on a universe in fast motion, and Frederick Winslow Taylor (1856–1915), in response to what he perceived as the deliberate hampering of the speed of industrial production by inefficient workers, had published The Principles of Scientific Management (1911), a manifesto that turned workers into virtual human machines.

Although much of the modern preoccupation with speed was associated with twentieth-century machine technology, the enthusiasm inspired by airplanes or internal combustion engine automobiles that set new records for speed was anticipated in the final decades of the preceding century by increased attention to rationalization and the shrinking of time and space in practically all areas of American culture. Communication and voice recording are a good case in point. Driven by new scientific insights into the nature of electromagnetic waves (Hans Christian Ørsted, James Clerk Maxwell, Heinrich Rudolph Hertz), experiments in rapid communication finally led to the invention of the telephone (Alexander Graham Bell), the phonograph, and the dictating machine (both developed by Thomas Edison). These inventions in turn facilitated the rapid growth of modern cities, whose administrations and office networks depended on instant communication without the delays of hand-carried messages and documents. The telephone, according to Stephen Kern, "accelerated business transactions by increasing the liquidity of securities and the speed of fundraising. J. P. Morgan averted a financial panic in 1907 when, over the telephone, he extended $25 million credit to several major banks threatened with excessive withdrawals" (p. 114). By 1900 Americans so thoroughly and efficiently put to use the new time-saving electrical devices that one stunned English traveler remarked, "Life in the United States is one perpetual whirl of telephones, telegrams, phonographs, electric bells, motors, lifts, and automatic instruments" (quoted in Mowry, pp. 1–2).

Artists and writers confronted, and quite frequently subverted, the shifting attitudes, mores, and behavioral patterns of Americans living in a culture dominated by science. Far from being a mere reflection of the larger society and its sociohistorical dynamics, art served to register the subtle changes that marked the dawn of a new era and helped to negotiate the tensions accompanying technological progress and the shift in scientific and economic paradigms. What is more, it gave voice to growing concerns about these developments, even while its own styles and formal structures already betrayed the impact and power of the new system. Although the artistic mind often appeared at odds with new trends and changing material conditions, much of modernist art and literature incorporated the new machine culture and its popular representations in photography and film with astonishing ease. What Thomas Eakins (1844–1916), a prominent voice in the nascent realist movement, represented in his famous painting The Gross Clinic (1875) was not just the wonder, awe, and disgust some felt toward the latest progress in anatomy, but also a formal affinity with scientific methods and objectivity. Eakins's emphasis on the human body as an object of study and observation was evidently out of sync with dominant Victorian aesthetics and the painting was consigned to the medical section of the Centennial Exposition in Philadelphia in 1876, rather than the fine arts section.

VISIONS OF THE PAST, ENVISIONING THE FUTURE

Not all writers, however, adopted the new trend toward "authenticity" and realism to articulate their take on the rampant materiality and fast-paced technological progress of America's "Gilded Age." Some used the well-established model of utopian writing, retooling its fantastic elements to launch scathing critiques of the times or to extrapolate a new and better form of society from the wonder-working prospects of contemporary science and technology. One of the most important and enduring examples of a critique is Mark Twain's A Connecticut Yankee in King Arthur's Court (1889); Edward Bellamy's Looking Backward: 2000–1887 (1888) exemplifies the more optimistic view. Both are steeped in the determinist ideologies of Social Darwinism and the "new" behavioral sciences and treat modern technology as a juggernaut, a historical force that cannot be annihilated or reversed. Yet while Bellamy (1850–1898) stresses the inherent capacity of modern technology to "heal" the widespread social tensions that threatened to divide American society at the turn of the century, Twain (1835–1910) envisions a technological apocalypse with fatal consequences for the future of nature and human civilization.

James M. Cox has suggested (p. 89) that Connecticut Yankee occupies in Twain's oeuvre approximately the same position that Pierre occupies in that of Herman Melville. But although both works followed towering masterpieces (Adventures of Huckleberry Finn and Moby-Dick, respectively), Twain's satire of nineteenth-century ingenuity and entrepreneurship might more easily be compared to Melville's The Confidence-Man (1857), also a disturbing, stylistically and structurally flawed critique of corporate America. Even though in Connecticut Yankee criticism of late-nineteenth-century progressivist ideology is partly defused by the story's humorous narrative frame, Twain's backward-in-time travelogue still stands as one of the most radical and deeply analytical revisions of the American technological dream. On the surface a story about an unlikely clash of cultures and historical periods (sixth-century Britain versus nineteenth-century Connecticut), its thematic scope extends to issues of writing and representation, mass media, communication networks, capitalist economics, ethics in authoritarian society, and, most prominently, science and the mystery of human nature.

The novel centers on how a modern American technophile who, after a head injury, loses consciousness and upon reawakening finds himself amid the knights at King Arthur's court, navigates his cultural alienation and the surplus of scientific knowledge arising from his advanced, nineteenth-century background. Much of its narrative thrust explores the enormous material progress Anglo-Saxon culture had made over the elapsed thirteen hundred years and also engages the changed psychological conditions that such progress entails. What is more, it asks the crucial question: To what degree can these changes be attributed to individual cultural responses? Or are they innate in human beings generally? Especially in the middle parts of the book, where human behavior is thoroughly discussed, Twain juxtaposes diverging concepts of personality that reflect his transitional position between Romantic idealism and late-nineteenth-century behaviorist social determinism.

In one of the most symbolically loaded moments of the text, Hank Morgan, a former superintendent at the Colt arms factory in Hartford, Connecticut, and his sixth-century ally Clarence prepare for final battle with the Roman Catholic church. To this end they rig the cave of their longtime opponent Merlin with modern explosives, turning the site into a veritable technological limbo. What makes this scene important, however, is less the display of fancy weaponry such as a battery of Gatling guns set off by a huge dynamo, or numerous glass cylinder dynamite torpedoes, than the underlying economics of rationalization, which became a major driving force in the history of modern society. Here is Morgan and Clarence's discussion of the rationale of their various contrivances:

"The wires have no ground-connection outside of the cave. . . . Each is grounded independently."

"No-no, that won't do!"

"Why?"

"It's too expensive—uses up force for nothing. You don't want any ground-connection except the one through the negative brush. The other end of every wire must be brought back into the cave and fastened independently, and without any ground-connection. Now, then, observe the economy of it. A cavalry charge hurls itself against the fence; you are using no power, you are spending no money, for there is only one ground-connection till those horses come against the wire; . . . Don't you see?—you are using no energy until it is needed; your lightning is there, and ready, like the load in a gun; but it isn't costing you a cent till you touch it off." (P. 421)

Twain's story abounds in references to eliminating waste and accelerating production, transportation, and communication. While in the above example it seems that the responsibility for ruthless economizing still lies with the person, the rugged, business-minded Yankee, Twain also offers a more universal and much bleaker explanation. Upon finishing his report on how they blow up the cave and its adjacent area, Clarence proudly exclaims: "It's an innocent looking garden, but you let a man start in to hoe it once, and you'll see" (p. 422). Once cultivation begins and human ingenuity is put to work, there is no way to avoid the total destruction of both nature and human beings. Influenced by contemporary social science, in particular the determinism of Hippolyte Taine (1828–1893) and Herbert Spencer (1820–1903), Twain depicts technological societies as essentially destructive and doomed to self-annihilation. The novel ends with a gruesome Armageddon of burnt flesh and a growing pile of dead bodies that foreshadow not only the horrors associated with modern technological warfare but also the ambiguous blessings of yet another late-century invention, the "electric chair."

Whereas in Connecticut Yankee the wedding of science and technology to democratic, republican agendas inexorably leads to disaster (it is, after all, Morgan's deep aversion to the centralist institutions of church and crown that triggers the final carnage), in Edward Bellamy's Looking Backward scientific and technological know-how prove advantageous in creating a truly democratic, Christian socialist society. And where Twain's bleak cultural critique invokes Melville's The Confidence-Man as its literary predecessor, Looking Backward follows in the tradition of American utopian reformist writing, associated with authors as diverse as Benjamin Rush, Thomas Paine, Robert Owen, and Harriet Beecher Stowe. Like them, Bellamy strongly believed that human beings are rational, eternally struggling to better social conditions and, over the long term, to eradicate tensions arising from ill-conceived notions of class and gender. In its celebration of technology put to democratic, socially stabilizing uses, Bellamy's utopian novel is a very American text. It tapped into a powerful desire among Americans at the turn of the twentieth century to eclipse the negativity expressed in Twain's and Charles Dudley Warner's Gilded Age (1873), for example, and to envision instead modern American society in the not too distant future as a "Golden Age," a technological paradise regained.

Although Bellamy initially admits that he "sought to alleviate the instructive quality of the book by casting it in the form of a romantic narrative" (p. 35), Looking Backward is far from a fantastic tale of America's bright technological prospects. As a former lawyer, muckraking journalist, and political activist, Bellamy took great care to address the social unrest and widespread economic fears lamented by most contemporary Americans. To every conceivable social plight, Looking Backward offers a solution, often backed by modern labor- and time-saving devices. It explores the impact on society of mass media such as radio and introduces a monetary system (including, in a surprisingly visionary insight, the use of credit cards) in which money no longer figures as a substitute for an imagined value but as a representative of everyone's true share of the gross national product. By merging the socialist convictions Bellamy encountered during a trip to Germany with progressivist technophilia, scientific determinism, the Puritan work ethic, and a millennialist belief that the promises of the New Testament will be fulfilled on American soil, he hoped to overcome the innate contradictions and paradoxes of end-of-century social and scientific discourse.

One of the tensions arising from an ever-accelerating market economy lay in the creation of large, powerful corporations and the growing influence they brought to bear on politics and the future of the nation. Another lay in the adaptation and redefinition of evolutionary social theory to ensure that it would not lead to the disruption of national consensus or spawn an all-out struggle for survival between rugged individuals. Put another way, in Looking Backward Bellamy tried to remedy nagging social problems with the very instruments that many suspected were playing a crucial role in the creation of these problems. And while he was convinced that much in contemporary American society had gone awry, he still believed that "the army of industry" he envisioned for America's utopian nationalist future would be an "army not alone by virtue of its perfect organization, but by reason also of the ardor of self-devotion which animates its members" (p. 89). His acute awareness of social ills and their underlying economic causes notwithstanding, Bellamy, like many of his nineteenth-century reformist predecessors, turns out to be an indefatigable idealist whose unremitting trust in the natural goodness of human beings led him to design a Christian socialist society where everyone, by dint of diligence in service to the nation, would be granted an appropriate rank and place. Although he abandons formal legislation, his is a system based on a codification of the "law of nature," or, in his own terms, "the logical outcome of the operation of human nature under rational conditions" (pp. 100–101).

Looking Backward exemplifies the ongoing attempt of modern writers and intellectuals to close the widening gap between the specialized spheres of science, technology, and the arts. Bellamy, who considered himself a writer first and only second a political visionary and activist, was careful to design his utopian state as a haven for all professions, including the fine arts, which, according to many of his contemporaries, were of only minor import to the wellbeing and further prosperity of the nation. Time and again, Dr. Leete, the author's mouthpiece and scientific tour guide of the novel, emphasizes the confluence of material and moral evolution, maintaining that what ultimately distinguishes the "new" from the "old" order is the recognition of merit in all fields of original genius, in science and mechanical invention as well as in music, art, writing, and design. As he tells a receptive Julian West, when Americans pushed on to a "new plane of existence with an illimitable vista of progress, their minds were affected in all their faculties. . . . There ensued an era of mechanical invention, scientific discovery, art, musical and literary productiveness to which no previous age of the world offers anything comparable" (p. 128). Significantly, Bellamy's plea for a more encompassing understanding of what it means to be productive coincides with his criticism of the dwindling authority of the "real" in modern capitalist society and the accompanying proliferation of signs that refer only to other signs rather than to material objects (a vicious dilution/delusion of responsibility he saw at work, for example, in the thriving American credit system). While in the new society all business is executed in direct relation to "real" things, in the old money and credit figured as their "misleading representatives" (p. 174). Rereading evolutionary development as a movement toward greater simplicity and thus authenticity, Bellamy expresses a desire for the "real thing" that became constitutive of modernist culture at large. Although this desire frequently conflicted with the harsh social and economic realities of late-nineteenth-century America, in Looking Backward it becomes the cornerstone of a meticulously reformed, "new" society and, in its wake, a more systematic, that is, scientific use of industrial power.

TECHNOLOGY, GENDER, AND THE ARTS AT THE TURN OF THE CENTURY

Few of Bellamy's fellow writers were prepared to view the differences between the reigning scientific materialism and the artistic imagination as anything but insurmountable. This is particularly true of women authors, who often denounced the positivist scientific worldview as in direct opposition to their own agendas. In the classic short story "The Yellow Wall-Paper" (1892), by Charlotte Perkins Gilman (1860–1935), the clash of science, gender, and the imagination is dramatized in such a way that it becomes the epicenter of mutual misunderstanding and discrimination. The story turns on the male "scientific" obsession with order, regularity, discipline, and self-control as personified by the husband/physician of the female protagonist. It also revolves around the imagery of the wallpaper and its elusive, ever-changing patterns, which introduce an elaborate play on essence versus appearance, on the real as a given versus reality as created by the imagination. And while the physician's wife represents the imaginative process, the story illuminates the quashing of the imagination by scientific rationalism and the toll such rationalism takes on a woman who does not submit to its limited, gendered propositions.

Critics repeatedly claimed that Gilman's intricately woven, double-voiced text should be read as a testament to the stamina and willpower of the female writer who, under circumstances adverse to her professional career, continued to confront current Social Darwinian stereotypes about differences between men and women and the social roles suitable for each. Gilman's own life as a mother, writer, and political activist stands as a glaring example of that stamina. A one-time follower of Bellamy and one of the most influential feminist thinkers of the pre–World War I period, Gilman wrote a utopian novel of her own, Herland (1915), based on ideas articulated in her earlier study Women and Economics (1898). Contrary to Bellamy, she believed that to restrict women in their intellectual and professional evolution amounted to hampering the evolution of the species as such. Although she shared his emphasis on the principles of evolution and human progress, hers was a program that included all members of society, regardless of gender or physical ability. Moreover, in "The Yellow Wall-Paper" she repudiated the pervasive positivist worldview (associated with the French sociologist Auguste Comte) by juxtaposing it with the female poetic imagination and female ingenuity.

Late-nineteenth-century feminist writers often conceived of the lingering opposition between science, technology, and art in terms of a struggle between the sexes and the respective social roles traditionally ascribed to each. In contrast to these efforts, two of their male contemporaries, Frank Norris (1870–1902) and Henry Adams (1838–1918), posited a more fundamental and ultimately futile battle between the past and the future, between the waning influence of a bygone era and the sweeping forces of the new century. In the first installment of his "epic of the wheat" trilogy The Octopus (1901), Norris dramatized a gigantic clash between nature and culture driven by the worldwide demand for natural resources and the ensuing ruthless exploitation of them. There is neither good nor bad in this novel of the post-frontier American West, only a universe of relentless forces connected to varying degrees with the economic interests of farmers, railroad corporations, and the political machine. Norris introduces, as alienated outsiders to this naturalist universe, two artist figures: Presley, a hopeless romantic aspiring to write the great poem of the West; and his former friend the shepherd Vanamee, who lives close to nature and has developed a poetic attitude by instinct. The two learn different lessons from the new order, yet they both eventually acknowledge their anachronistic marginality in view of the overriding presence of the machine. While in the rapidly industrializing West the creative powers of the imagination are utterly out of place, the merging of nature, machine, and corporate culture has spawned its own generative force: the dynamics of the marketplace and its sole objective, economic growth. Against the inexorable onrush of social evolution, the fate of the individual, which had long been the focus of artistic activity, is but a grain of sand in the cogs of the great organic machinery that constitutes and drives the world. There is no malevolence, no heroism or sacrifice in Norris's sobering modern American epic, only forces, conditions, laws.

The other American writer who came to a devastating conclusion about the future of art in modern society is Henry Adams. Standing amid the awe-inspiring machinery at the Great Exposition of 1900, Adams wrote, in an oft-quoted passage of The Education of Henry Adams (1907), that he "began to feel the forty-foot dynamos as a moral force, much as the early Christians felt the Cross" (p. 380). What he finds especially striking in this new electro-mechanical environment is the smooth, noiseless way that the dynamo wields its enormous power: "Barely murmuring—scarcely humming an audible warning to stand a hair's-breadth further for respect of power—while it would not wake the baby lying close against its frame" (p. 380). The image of the machine lulling a baby to sleep drives home the writer's concern about modern technology and his increasing estrangement from late-nineteenth- or early-twentieth-century American culture. Adams metaphorically describes the ongoing replacement of the creative power of art by the overwhelming reproductive potential of the machine. In the modern scientific-technological environment, as his distancing stance of third-person autobiographical narration suggests, the writer ceases to be a producer and creator and adopts the role of commentator and disconnected observer. By stubbornly clinging to outmoded aesthetic models such as the female procreative power he saw embodied in both the Epicurean Venus and the Christian Virgin, powers that had never been duly acknowledged by Protestant America, this avowedly nineteenth-century author finally surrendered to having "his historical neck broken by the sudden irruption of forces totally new" (p. 382).

To be sure, the authorship, publication, and distribution of literary texts survived well into the days of word processing, e-books, and blogs. Moreover, battles over the role and fate of the writer in a new, fast-changing nation date back at least as far as the American Renaissance and Jacksonian democracy. Yet by foregrounding issues of creativity and the representation of the real within the technological framework of modern society, both male and female end-of-century writers paved the way for modernist debates. The influential German critic Walter Benjamin (1892–1940) described the state of art at the turn of the century as involving a fundamental shift from originality to repetition, from unique, authentic works of art to mechanically reproduced, dissimulating works. "Around 1900 technical reproduction had reached a standard that not only permitted it to reproduce all transmitted works of art and thus to cause the most profound change in their impact upon the public; it also had captured a place of its own among the artistic processes" (Illuminations, pp. 219–220). One of the key terms of this essay is "aura," the ritual function of art, its ongoing negotiation of distance and presence, authenticity and artificiality. Aura, according to Benjamin, is ontologically connected to the original, unique work of art and for this reason does not permit of reproduction or replication. Benjamin is mainly concerned with painting, photography, and film, yet his general argument applies to written texts as well. With technological progress and the concomitant proliferation of forums for amateur writers (newspapers, professional and special interest magazines, dime novels, serials, and so forth), the distinction between author and reader, between the "real" and the "sham" writer, became increasingly blurred. In the age of mechanical reproduction, as Benjamin explains elsewhere, "the reader is at all times ready to become a writer." Because of the division of labor, the majority of the workforce are "experts" in something (if only in a very circumscribed and specialized area), that is, potential authors: "As an expert—even if not on a subject but only on the post he occupies—he gains access to authorship" (Reflections, p. 225).

MODERNISM AND THE NEW AESTHETICS OF SPEED

Inasmuch as Benjamin attributed the disappearance of traditional authorship to technological progress, he evoked a recurring theme in Western cultural criticism: the widespread anxiety about the loss of authorial control in a scientific-technological environment, which resurfaced with every new and more powerful technology. Yet this is only half the story. Along with such anxieties there had always been an effort to co-opt the scientific and technological paradigms in art. And while Adams's and Norris's negative stances foreshadow the elegiac disgust with contemporary mass society that informs the later works of T. S. Eliot and Ezra Pound, they in no way reflect the enthusiasm about that same modern environment by the early vorticist Pound, by much of popular literature, dime novels, and books for adolescent readers (such as the widely popular Tom Swift series), or by the many writers, poets, painters, photographers, and musicians committed to forging from an increasingly machine-engineered world a new form of aesthetics. To a large degree this new modernist aesthetics is predicated on the encroachment of science and technology into practically every sphere of society, and, as a result, on the ever-accelerating pulse of everyday life. Eager to respond to the fast pace of modern life, authors presented their own view of the formative and deforming power of speed, translating the dynamic potential of urban space into an abstract, kinetic verbal construction.

See also Centennial; Darwinism; Health and Medicine; Pseudoscience; Psychology; Transportation; Weaponry

BIBLIOGRAPHY

Primary Works

Adams, Henry. 1907. The Education of Henry Adams. Boston: Houghton Mifflin, 1961.

Bellamy, Edward. 1888. Looking Backward, 2000–1887. Edited by Cecelia Tichi. New York: Penguin, 1982.

Carnegie, Andrew. 1908. Problems of To-day: Wealth, Labor, Socialism. Garden City, N.Y.: Doubleday, 1933.

Gilman, Charlotte Perkins. 1915. Herland. Introduction by Ann J. Lane. New York: Pantheon Books, 1979.

Twain, Mark. 1889. A Connecticut Yankee in King Arthur's Court. Edited by Bernard L. Stein. Berkeley: University of California Press, 1983.

Whitehead, Alfred North. Science and the Modern World. New York: Macmillan, 1925.

Secondary Works

Benjamin, Walter. Illuminations: Essays and Reflections. Edited by Hannah Arendt. Translated by Harry Zohn. New York: Harcourt, Brace, and World, 1968.

Benjamin, Walter. Reflections: Essays, Aphorisms, Autobiographical Writings. Edited by Peter Demetz. Translated by Edmund Jephcott. New York: Harcourt Brace Jovanovich, 1978.

Cox, James M. "A Connecticut Yankee in King Arthur's Court: The Machinery of Self-Preservation." Yale Review 50 (autumn 1960): 89–102.

Cummings, Sherwood. Mark Twain and Science: Adventures of a Mind. Baton Rouge: Louisiana State University Press, 1988.

Golden, Catherine, ed. The Captive Imagination: A Casebook on "The Yellow Wallpaper." New York: Feminist Press, 1992.

Kenner, Hugh. The Mechanic Muse. New York: Oxford University Press, 1987.

Kern, Stephen. 1983. The Culture of Time and Space, 1880–1918. Cambridge, Mass.: Harvard University Press, 2003.

Licht, Walter. Industrializing America: The Nineteenth Century. Baltimore: Johns Hopkins University Press, 1995.

Marvin, Carolyn. When Old Technologies Were New: Thinking about Electric Communication in the Late Nineteenth Century. New York: Oxford University Press, 1988.

Mowry, George E. The Era of Theodore Roosevelt and the Birth of Modern America 1900–1912. New York: Harper, 1958.

Mumford, Lewis. Technics and Civilization. New York: Harcourt Brace, 1934.

Orvell, Miles. The Real Thing: Imitation and Authenticity in American Culture, 1880–1940. Chapel Hill: University of North Carolina Press, 1989.

Pacey, Arnold. The Maze of Ingenuity: Ideas and Idealism in the Development of Technology. 2nd ed. Cambridge, Mass.: MIT, 1992.

Patai, Daphne, ed. Looking Backward, 1988–1888: Essays on Edward Bellamy. Amherst: University of Massachusetts Press, 1988.

Tichi, Cecelia. Shifting Gears: Technology, Literature, Culture in Modernist America. Chapel Hill: University of North Carolina Press, 1987.

United States Bureau of the Census. Historical Statistics of the United States, Colonial Times to 1970. Washington, D.C.: U.S. Department of Commerce, Bureau of the Census, 1975.

Klaus Benesch

Science and Technology


SCIENCE AND TECHNOLOGY

the interplay between technology and science
drivers and incentives in knowledge production
access and progress
feedback from technology to science
could economic growth have taken place without science?
bibliography

The view that science somehow leads to technology, enshrined in what is oddly known as the "linear model," fared poorly in the late twentieth century. The linear model has it that in the past, pure science led to applied science, applied science to technology, and from there the path led to engineering and production. Among late-twentieth- and early-twenty-first-century scholars, however, there is consensus: first, that technology drove science at least as much as the reverse; second, that scientific understanding—whatever that precisely means—is neither a necessary nor a sufficient condition for technological progress; and third, that both are deeply influenced by a host of cultural, social, and economic factors too numerous and too contested to list here. There is a danger that in their haste to criticize the highly simplified and schematic model, critics will end up without an appreciation of the importance of scientific knowledge in the process of technological and economic development between 1780 and 1914. Every technique has an "epistemic base"—that is, knowledge about natural regularities and phenomena—on which it rests. At times this basis is very narrow or barely even exists; in those cases, the technique in question works, but nobody is quite sure how and why. In other instances, some minimum has to be known before the technique can be realized.

the interplay between technology and science

Perhaps the safest generalization one can make is that there was no single model or straightforward relationship between scientific knowledge and technological practice. Each industry and each practice differed in its need to rely on the formalized knowledge that was still known as "natural philosophy" in 1780. In the ensuing "long nineteenth century" (1789–1914), a large number of important inventions were made that owed little to science. This would include most breakthroughs in the textile industry; some of the canonical inventions that revolutionized the cotton industry posed tricky mechanical problems that took mechanical ingenuity to solve, but "science" as such had little to do with their solution. Similarly, the invention of barbed wire by Joseph Farwell Glidden (1813–1906) in 1874, while of substantial significance to the American agricultural economy, owed nothing to science. A common story is that science discovers some phenomenon that can be exploited. The technique that emerges subsequently serves as a focusing device that makes scientists take a closer look, and as they begin to understand the underlying natural processes better and better, they can improve the technique and adapt it to new uses.

The paradigmatic example is of course steam power. By 1780 steam power was on its way to assume a central role in the industrialization and transportation revolution. The "science" behind it was nontrivial: to build an atmospheric engine, one had to know at least that the earth's surface was at the bottom of an atmosphere, whose pressure could be exploited. James Watt's (1736–1819) improvements to Thomas Newcomen's (1663–1729) steam engine depended in part on the further realization, due to his fellow Scotsman William Cullen (1710–1790), that in a vacuum water boils at much lower, even tepid, temperatures, releasing steam that would ruin the vacuum in a cylinder. Yet "understanding" steam power in a way that would conform to our notions of science was still many decades off: in the 1820s and 1830s, the best theorists of steam power still regarded it as a vapor engine rather than recognizing it for the heat engine it was. Inspired and focused by the steam engines they observed, the great theorists of thermodynamics such as Sadi Carnot (1796–1832) and James Prescott Joule (1818–1889) finally formulated the science of thermodynamics. Technology did not "depend" on science, but better science could improve it to the point where the productivity growth due to continuous improvements drove economic growth.

Another example is the electromagnetic telegraph, one of the truly transforming inventions of the nineteenth century. Here, too, some science was necessary to make it possible. In this case, it was the discovery that electricity and magnetism were related after all (something that had been in serious doubt). In 1820 a Danish physicist, Hans Christian Oersted (1777–1851), brought a compass needle near a wire through which a current was passing. The current forced the needle to point at a right angle to the wire. A number of scientists put their minds to the problem, and by the mid-1830s Joseph Henry (1797–1878) and others realized that an electromagnetic telegraph was possible, and by 1837 the device was shown to work.

Yet the epistemic base was still quite narrow, and it took the genius and energy of William Thomson (1824–1907, later Lord Kelvin) to work out the principles governing the relation between the signal and the resistance, inductive capacity, and length, and to compute the resistivity of copper and the inductive capacity of gutta-percha, the insulating material. He used his knowledge to invent a special galvanometer, a siphon recorder (which automatically registered signals), and a technique of sending short reverse pulses immediately following the main pulse to sharpen the signal. These inventions were based on best-practice mathematical physics, and although the epistemic base was far from complete (Kelvin resisted the electromagnetics of James Clerk Maxwell [1831–1879] and held on to the notion of ether, believed to be the weightless medium for the transmission of electromagnetic waves), they improved the telegraph in every direction.

A third example of the subtle interplay between science and technology in the nineteenth century is found in soil chemistry. Since antiquity, farmers had realized that they could improve agricultural output by adding certain substances to the soil. Among those substances, animal manure and marl were widely used. Nobody, of course, quite understood how and why these procedures worked, and as a result progress in agricultural productivity was limited when judged by the standards of later development. By 1800 agricultural writers were busy cataloging what kinds of substances worked on which soils and for what crops, but the epistemic base of this practice remained rather narrow and consisted mostly of empirical patterns that these writers thought they were observing. However, the closing decades of the eighteenth century saw the rise of modern chemistry, and by the 1820s and 1830s, German chemists led by Friedrich Wöhler (1800–1882) and Justus von Liebig (1803–1873) discovered what today is called organic chemistry and realized that it helped them understand why certain substances such as phosphates and nitrates improved agricultural productivity. By midcentury, the important role of various chemical substances was better understood, and European farmers began to apply potash and nitrates to their soils. The greatest triumph of science was beyond question the synthesis of ammonia from atmospheric nitrogen: nitrates were recognized as an essential ingredient in both fertilizers and explosives, yet although most of the atmosphere consists of nitrogen, it was not known how to extract it. Fritz Haber (1868–1934) and Carl Bosch (1874–1940) solved this problem around 1910. Both were highly trained professional chemists, yet their process still relied on a great deal of trial-and-error research.

Perhaps nowhere are the complexities of the relation between science and technology better illustrated than in medical technology. The growth of medical science was unusually slow. It is not an exaggeration to point out that by 1800, medical science had developed little beyond the great medical writers of antiquity. Theories of disease were confused and mutually contradictory, and the ability of science to prevent, let alone cure, often-fatal infectious diseases was negligible. This started to change in the early nineteenth century due to two major developments. The first is the recognition that relatively poorly understood natural phenomena can be analyzed by means of statistical data. On that account, for instance, it became clear through the research of the French physician Pierre-Charles-Alexandre Louis (1787–1872) around 1840 that bleeding ill patients did little to improve their health, and (through the work of British physicians such as John Snow [1813–1858] and William Budd [1811–1880] in the 1850s) that water that appeared and tasted clean could still transmit deadly diseases. As a result, a great deal of effort was directed toward filtering the water supply and separating drinking water from waste decades before the actual epistemic base of infectious diseases was established by Louis Pasteur (1822–1895) and Robert Koch (1843–1910) in the 1860s and 1870s. Following Pasteur and Koch, however, it not only became clear how and why Louis and Snow had been correct, but also how to apply this knowledge to further advance private and public health through preventive medicine. Pasteur's science helped change and improve the technology of surgery as much as it improved that of food canning—even if both had existed before his work.

drivers and incentives in knowledge production

The model that scientific knowledge somehow "leads" to technology is an oversimplification for another reason as well. Science is more than just the formal and consensual knowledge familiar to the twenty-first century. The heritage of the eighteenth century to the modern age and the taproot of technological progress and economic growth was a radically different view of why and how science should be practiced. Curiosity and "wisdom" had to make room for another set of motives—namely, the growing conviction that understanding nature was the key to controlling it, and that controlling nature was in turn the key to technological and economic progress.

This attitude, often traced back to Francis Bacon (1561–1626), became more and more influential in the eighteenth century. It involved the separation of scientific practice from religion, the belief that nature was orderly, and the conviction that natural laws, once properly formulated, were universal and admitted no exceptions (such as magic). It involved major cultural changes, above all the practice of "open science" (that is, placing scientific findings in the public realm), a practice that had emerged during the Renaissance but only became unequivocally established during the second half of the seventeenth century. Open science did two things. First, it made scientific knowledge available to those who might be able to use it. Second, it increased the credibility of scientific knowledge by exposing it to the scrutiny and criticism of other experts. It was widely believed—often somewhat over-optimistically—that once scientific claims had been exposed to the rest of the world, those that survived must be correct. Scientists were rewarded by fame, prestige, and at times comfortable and secure positions, but they sought credit, not profit.

By 1780 the realm of useful knowledge had bifurcated into knowledge that was "propositional" (including science, mathematics, geography, and a catalog of successful techniques) in that it stated discoveries about nature and placed them in the public realm, and knowledge that was "prescriptive," that is, provided the actual instructions on how to produce. The latter kind of knowledge was increasingly driven by profit motives, and for it to keep expanding, it needed to secure a way of compensating inventors for their efforts and investments. This could be (and was) done in a variety of ways. One was to secure intellectual property rights in the form of patents, which would place the knowledge in the public realm but prohibit its exploitation without the permission of the patentee. The second was to keep the invention secret, a strategy that could work at best only if the innovation could not be reverse-engineered. The third was to reward the inventor through some formal government body that assessed the value to society of this knowledge and paid the inventor from the public treasury. Finally, a few inventors simply relied on the advantage of being the first mover; they knew they would be imitated but hoped to make enough money simply by getting there first.

access and progress

In any event, the central factor in the growth of technology in the period from 1780 to 1914 was the continuous improvement in the access to useful knowledge. Knowledge meant both power and prosperity, but only if it could be accessed by those best able to exploit it. At times, of course, scientists rolled up their sleeves and applied their knowledge to new techniques themselves. The modern-age specialization between ivory-tower theorists and practically minded engineers and inventors (more of a stereotype than a reality even in the twenty-first century) was comparatively rare in the period of the First Industrial Revolution. Many theorists and experimentalists became interested in and solved applied production problems. The great chemist Humphry Davy (1778–1829), to cite one example, invented the mining safety lamp, wrote a textbook on agricultural chemistry, and discovered that a tropical plant named catechu was a useful additive in tanning. His colleague Benjamin Thompson (Count Rumford, 1753–1814) was most famous for the proof that heat is not a liquid (known as "caloric") that flows in and out of substances. Yet Rumford was deeply interested in technology, helped establish the first steam engines in Bavaria, and invented (among other things) the drip percolator coffeemaker, a smokeless-chimney Rumford stove, and an improved oil lamp. The physicist Joseph Henry (1797–1878) can probably make a good claim to being the inventor of the electromagnetic telegraph, and later in the nineteenth century Lord Kelvin owned dozens of patents.

Communication between scientists and manufacturers became a matter of routine in the late eighteenth and nineteenth centuries. Such access is essential if the growth in useful knowledge is to have economic consequences. Early on, such contact often took place in meeting places and scientific societies that became typical of Enlightenment Europe. Of those, the most famous were the Birmingham Lunar Society, in which manufacturers such as Josiah Wedgwood (1730–1795) and Matthew Boulton (1728–1809) picked the brains of scientists such as Joseph Priestley (1733–1804) and Erasmus Darwin (1731–1802), and the London Chapter Coffee House, which boasted a similarly distinguished clientele. The Royal Institution, founded in 1800, offered lectures to the general public.

During the nineteenth century, the number of forums in which manufacturers and engineers could meet and communicate with scientists increased rapidly. Scientists were often retained as consultants and inventors. The German chemical and electrical firms, which carried out a substantial amount of the research and development that created the Second Industrial Revolution, often retained university professors who practiced a "revolving door" kind of career between their academic and industrial jobs. Thomas Edison (1847–1931), whose knowledge of science was intuitive rather than formal, employed a number of highly trained scientists with whom he consulted, though at times he wisely chose to ignore their advice. Yet what has to be realized is that personal contact was only necessary insofar as knowledge could not be codified—that is, described and depicted in words or pictures.

The proliferation of scientific and technological literature in the nineteenth century was simply enormous. This proliferation took the form of encyclopedias, textbooks, and manuals, as well as scientific periodicals of many varieties. Libraries sprang up everywhere, and the declining real price of books and printed matter made for an ever-growing accessibility of scientific and mathematical knowledge to those who could make use of it. Equally important, engineering education became increasingly science-based. In the French grandes écoles and in the German universities, mining academies, and technical colleges, formal science became part of the education of even midlevel technicians. Inventing remained, as it is in the twenty-first century, open to "tinkerers" such as Sir Henry Bessemer (1813–1898), Sidney Gilchrist Thomas (1850–1885), and Edison. Yet their inventions, no matter how brilliant, only worked because they were subsequently refined and improved by people well trained in the relevant science.

Of course, some classic inventions originally were simply mechanical. The zipper (patented in 1893 by Whitcomb Judson) and paper clips (introduced by the Gem company in Britain in the 1890s) were much like barbed wire, simple and useful ideas that needed no science. But even in many cases of simple inventions, knowledge of the finer details of metallurgy, electricity, or mass production engineering was needed for further development.

feedback from technology to science

The interplay of science and technology in the nineteenth century was bidirectional and can be viewed as positive feedback in the sense that technology helped science just as science helped technology. Such positive feedback mechanisms often lead to unstable systems that never converge to a given position. While such a view may be unsettling to scholars who like to think of the world as inherently stable and predictable, it is perhaps not an inappropriate way of viewing the historical process of technological change from 1780 to 1914, a period that displays continuous unpredictable change as its most enduring feature.

The ways in which technology affected science can be viewed in three broad categories. First, as has already been shown, technological practices directed and focused the interests of researchers to discover how and why they worked. The search for the deep nature of electricity spurred the work of such scientists as Svante August Arrhenius (1859–1927), George Johnstone Stoney (1826–1911), and Sir Joseph John Thomson (1856–1940), leading to the discovery of the electron. It is almost comical to contemplate Thomson's alleged toast at an event celebrating his Nobel Prize in physics: "Here's to the electron, may no one ever find a use for it." By that time, of course, electrical lighting and appliances were ubiquitous. The practice of food canning, invented by Nicolas Appert (1749–1841) in 1795, stimulated Pasteur into his famous studies of putrefaction.

Or consider geology: the need to develop a better method to prospect for coal inspired William Smith (1769–1839) toward a growing understanding of geology and the ability to identify and describe strata on the basis of the fossils found in them. The idea (already widely diffused on the Continent but unknown to Smith) that there were strong natural regularities in the way geological strata were layered led to the first geological maps, including Smith's celebrated Geologic Map of England and Wales with Part of Scotland (1815), a "map that changed the world." It increased the epistemic base on which mining and prospecting for coal rested. One can track with precision where and through which institutions this interaction between propositional and prescriptive knowledge took place and the institutional environment that made them possible. Although the marriage between geology and mining took a long time to yield results, the widening epistemic base in nineteenth-century mining technology surely was the reason that the many alarms that Britain was exhausting its coal supplies turned out to be false.

Technology also stimulated science by allowing it to carry out new research. The extent to which science was constrained by instruments and tools is rarely fully appreciated. Astronomy, it has often been observed, entered a new age the day that Galileo Galilei (1564–1642) aimed his brand-new telescope toward the sky. Microscopy had a similar effect on the world of microorganisms. The invention of the modern compound microscope by Joseph Jackson Lister (1786–1869, father of the surgeon) in 1830 serves as another good example. Lister was an amateur optician, whose revolutionary method of grinding lenses greatly improved image resolution by eliminating spherical aberrations. His invention changed microscopy from an amusing diversion to a serious scientific endeavor and eventually allowed Pasteur, Koch, and their disciples to refute spontaneous generation and to establish the germ theory. The chemical revolution initiated by Antoine Laurent Lavoisier (1743–1794) and his French collaborators might not have achieved such a triumph had he not been equipped with unusually precise instruments. The famous mathematician Pierre-Simon de Laplace (1749–1827) was also a skilled designer of equipment and helped to build the calorimeter that resulted in the celebrated Memoir on Heat by Laplace and Lavoisier (1783), in which respiration was identified as analogous to burning. Much of the late-eighteenth-century chemical revolution was made possible by new instruments such as Alessandro Volta's (1745–1827) eudiometer, a glass container with two electrodes intended to measure the content of air, used by Henry Cavendish (1731–1810) to show the nature of water as a compound.

Perhaps the classic case of an invention that enabled scientific progress was the Voltaic Pile, the first battery that produced continuous current, invented by Volta in 1800. Through the new tool of electrolysis, pioneered by William Nicholson (1753–1815) and Davy, chemists were able to isolate element after element and fill in much of the detail in the maps whose rough contours had been sketched by Lavoisier and John Dalton (1766–1844). Volta's pile, as Davy put it, acted as an "alarm bell to experimenters in every part of Europe." Electrochemistry became the tool with which much of the chemical revolution was placed on a firm and systematic footing. For instance, Davy established that chlorine, the miraculous bleaching substance that played such a major role in the new cotton industry, was an element and not a compound.

Finally, technology often made it possible to verify and test scientific hypotheses and to decide scientific controversies. Much science is the subject of endless debate, and nothing will settle a scientific debate as effectively as a demonstrably useful application. The success of Koch and his followers in identifying a host of bacterial pathogens and the subsequent advances in public and private health helped wipe out whatever doubts remained about the validity of the germ theory. Heinrich Rudolph Hertz's (1857–1894) work on oscillating sparks in the 1880s and the subsequent development of wireless communications by Sir Oliver Joseph Lodge (1851–1940) confirmed Maxwell's purely theoretical work on electromagnetic fields.

Most decisively, the success of the Wright brothers at Kitty Hawk in 1903 resolved the dispute among physicists on whether heavier-than-air machines were feasible. In 1901 the astronomer and mathematician Simon Newcomb (1835–1909, the first American since Benjamin Franklin [1706–1790] to be elected to the Institute of France) had still opined that flight carrying anything more than "an insect" would be impossible. Here, too, theory and practice worked cheek-by-jowl. The Wright brothers worked closely with Octave Chanute (1832–1910), the leading aeronautical engineer of the age. Yet it was only following their successful flight that Ludwig Prandtl (1875–1953) published his magisterial work on how to compute airplane lift and drag using rigorous methods.

could economic growth have taken place without science?

It is often argued that the First Industrial Revolution (1760–1830) owed little to formal science. Most of the pathbreaking inventions such as Sir Richard Arkwright's (1732–1792) throstle (1769) or Henry Cort's (1740–1800) puddling and rolling technique (1785) were independent of the scientific advances of the age. While it is easy to show scientific progress during the Industrial Revolution, it is not easy to come up with many mechanical inventions that required advanced scientific knowledge as such. There are, of course, a few such instances, but before the middle of the nineteenth century they were not the rule.

In other words, scientific knowledge before 1850 was an input in innovation, but its importance was not decisive. Perhaps the best way of thinking about it is to realize that in addition to science affecting technology and vice versa, both were affected by a culture of growing material rationalism associated with the Industrial Enlightenment. It is interesting, however, to note that the major inventors of the time increasingly felt that they needed science and scientists to help them. Watt's milieu of scientists in Glasgow (and later Birmingham), Wedgwood's prodding of scientists (including Lavoisier himself) for advice on the technical problems that came up with his pottery, or the obsession of Leeds woolen manufacturer Benjamin Gott (1762–1840) with scientific experiment and chemistry demonstrate that such beliefs were widespread, at least in the circles that counted most.

As the nineteenth century advanced, such expectations were increasingly realized. One of the less well-known consequences of the chemical revolution is the work of the French chemist Michel-Eugène Chevreul (1786–1889), who discovered in the 1820s the nature of fatty acids and turned the manufacture of soap and candles from an art into a science. As director of dyeing at the Manufacture des Gobelins, he had a direct interest in the chemistry of dyes and colors. The original work on the chemistry of dyeing had been carried out by his predecessor at the Gobelins, Claude-Louis Berthollet (1748–1822, famous for the discovery of the bleaching properties of chlorine), but his work had been cut short by his political activities and it fell to Chevreul to realize his program.

The progress of steel, one of the truly central inventions that ushered in the Second Industrial Revolution, also depended on science more than the usual story of the invention of the Bessemer process suggests. The epistemic base of steelmaking at the time was larger than Sir Henry Bessemer's (1813–1898) knowledge. This was demonstrated when an experienced and trained metallurgist named Robert Forester Mushet (1811–1891) showed that Bessemer steel contained excess oxygen, a problem that could be remedied by adding a decarburizer consisting of a mixture of manganese, carbon, and iron. In the years following Bessemer and Mushet's work, the Siemens-Martin steelmaking process was perfected, and Henry Clifton Sorby (1826–1908) discovered the changes in crystals in iron upon hardening and related the trace quantities of carbon and other constituents to the qualities and hardness of steel. Steelmaking may not have been "scientific" in the modern sense of the word, but without the growing science of metallurgy, its advance eventually would have been stunted.

Economic growth can take place in the absence of advances in knowledge, but when it does so, it usually is more vulnerable and harder to sustain over long periods and large areas. When it is based on advances in knowledge, it is much less likely to be reversed. The twentieth century made a serious attempt to return to barbarism and to undo the advances of the years from 1780 to 1914, but in the end progress was resumed and has led to the stupefying growth in riches of the post-1950 decades.

The "Great Synergy," as Vaclav Smil has referred to it, between science and technology (or perhaps between propositional and prescriptive knowledge) is the central event of modern European history. It led to sustained and irreversible gains in productivity and quality of life, to the doubling of life expectancy, and to the realization of lifestyles that in 1780 must have seemed unimaginable in their material comfort and opulence.

See also Agricultural Revolution; Industrial Revolution, Second.

bibliography

Headrick, Daniel R. When Information Came of Age: Technologies of Knowledge in the Age of Reason and Revolution, 1700–1850. Oxford, U.K., 2000.

Jacob, Margaret C. Scientific Culture and the Making of the Industrial West. New York, 1997.

Jacob, Margaret C., and Larry Stewart. Practical Matter: Newton's Science in the Service of Industry and Empire, 1687–1851. Cambridge, Mass., 2004.

Mokyr, Joel. The Lever of Riches: Technological Creativity and Economic Progress. New York, 1990.

——. The Gifts of Athena: Historical Origins of the Knowledge Economy. Princeton, N.J., 2002.

——. "The Intellectual Origins of Modern Economic Growth." Journal of Economic History 65, no. 2 (2005): 285–351.

Musson, A. E., and Eric Robinson. Science and Technology in the Industrial Revolution. Manchester, U.K., 1969.

Petroski, Henry. Invention by Design: How Engineers Get from Thought to Thing. Cambridge, Mass., 1996.

Rosenberg, Nathan. Perspectives on Technology. Cambridge, U.K., 1976.

——. "How Exogenous Is Science?" In his Inside the Black Box: Technology and Economics. Cambridge, U.K., 1982.

Smil, Vaclav. Creating the Twentieth Century: Technical Innovations of 1867–1914 and Their Lasting Impact. New York, 2005.

Smith, Crosbie, and M. Norton Wise. Energy and Empire: A Biographical Study of Lord Kelvin. Cambridge, U.K., 1989.

Winchester, Simon. The Map That Changed the World: The Tale of William Smith and the Birth of a New Science. London, 2001.

Joel Mokyr

Science and Technology

views updated May 18 2018

Science and Technology

The connections between science, technology, and Western colonialism are strong and complex. The connections were driven and shaped by the European scientific revolution of the seventeenth century, as well as the growing authority of science in the eighteenth century Enlightenment period. Together, these developments established a modern mentality of dominance and expansion, which differed significantly from the premodern period. The new methods of science drove and seemed to vindicate humankind's dominance over, and knowledge of, nature. This ambition frequently translated into exploration, expansion of territory, and consolidation of European authority over indigenous people.

There are four domains of activity where scientific and technological developments intersected most clearly with Western colonialism. First, ever-changing technologies of travel both facilitated and encouraged exploration and territorial expansion. These related both to ocean travel and land travel, in particular the railroad. Second, communication technologies evolved rapidly, especially in the nineteenth century, linking continents and people in novel ways. Innovations in transport and communication in the eighteenth and nineteenth centuries were spurred by industrializing Britain, and later France, Germany, and the United States. Crucial new technologies were created, themselves requiring extensive circuits of colonial trade in raw materials.

The third domain involves scientific advancements in the field of medicine and health care. Western colonialism created vast medical problems of illness, especially epidemics of infectious disease in indigenous communities. But conversely and paradoxically, one of the driving forces of Western colonialism came to be an apparently curing and caring one, whereby Western hygiene and public health were understood to be one of the great benefits brought to different parts of the world. Fourth, Western science and technology facilitated the development of new arms and weapons. While the contest of arms between colonizers and the colonized was not always as one-sided as might be expected, firearms technology permitted colonization of local people, often in the most brutal way. Differential arms technology also determined the outcome of territorial wars between colonial powers. Since 1450, the European idea of progress has applied to scientific knowledge, imperial territorial, military and administrative expansion, and the increasingly dominant adherence to a Western "civilizing" mission.

TRAVEL AND TRANSPORTATION

Technologies of transport and travel have enabled and shaped Western colonialism from the Renaissance period onward. The Iberian powers of the Mediterranean and the Atlantic honed sailing and navigating skills for military and fishing purposes over many generations. Square sails were increasingly used alongside lateen sails, an innovation from the Islamic world, which permitted ships to beat into the wind. Spanish and Portuguese sailors in particular developed skills, knowledge, and technology for increasingly wide Atlantic voyages, to the Cape Verde Islands, the Madeiras, and the Canary Islands; along the African coast; and to the Americas.

The technology to make great ocean voyages was not within the grasp of Europeans alone, however. Chinese navigation and shipping knowledge was comparable in the early period, and Polynesian cultures made long Pacific voyages, between the Hawaiian Islands and Aotearoa/New Zealand, for example. In the sixteenth- and seventeenth-century Atlantic, and in the eighteenth-century Pacific, European explorers, traders, missionaries, and military personnel often adopted and adapted local means of transport, especially inland. Hudson's Bay Company traders around the North American Great Lakes, for example, typically traveled by canoe. However, European navigating, sailing, mapping, and shipbuilding technology improved incrementally over many generations, facilitating the establishment of seasonal coastal trading posts, permanent plantation settlements, and the commercial endeavors of the Atlantic: slavery and the sugar, tobacco, and fur trades.

Steam power and iron were the twin innovations of the British industrial revolution, and both revolutionized transportation, in turn shaping events in the colonial world. In the nineteenth century, there was a transition from wooden to iron-hulled ships. With so many European forests denuded, and British shipbuilding largely importing timber, iron offered many advantages for the shipbuilding industries. Wrought-iron ships weighed far less, were more durable, and the design of ships—their possible size and shape—was more flexible. From the late 1870s, a further transition from wrought iron to steel made ships lighter and more adaptable still.

The nineteenth-century transition from sail to steam affected both oceanic transport and river transport, and facilitated Western exploration of interior African, Asian, and American waterways. Especially in the African continent, steamships made travel possible deep inside a region previously largely closed to Europeans. Developing from the navigation of the Hudson River in New York in 1807, the steam-powered ship appeared in colonial contexts from the 1820s, especially in India, in British movement around China, and later in Africa.

As early as the 1830s, steamers were regularly carrying passengers and freight along the Ganges, and steamers came to be central to various military encounters, for example in the war between the British East India Company and the Kingdom of Burma from 1824. Steam navigation between Britain and India soon also became a reality. The British government invested in exploratory navigation by steamer across two overland routes: via Mesopotamia and via Egypt, the latter becoming the main route after the Suez Canal opened in 1869. Hybrid steam/sail transport across the Atlantic was possible from 1819, and from 1832 by steam alone. While the steamers did not entirely eclipse sail, especially for navies, they nonetheless reduced travel time considerably between west and east, north and south.

The development of railroads in the 1840s and 1850s consolidated internal expansion and investment in many areas, tying Western and colonial economies ever more tightly together. In Britain, for example, railroads held particular interest for the Lancashire cotton industry, which sought rapid access to cotton-growing districts, as well as to Indian consumers of cotton garments. By the 1860s and 1870s, railway lines criss-crossed the Indian subcontinent. This involved building large bridges across frequently flooding rivers, themselves considerable engineering feats. In the same period, the transcontinental railroads spanned North America from east to west, bringing an infrastructure and a cultural and administrative permanence to territory and people who had previously been in a more ambiguous and flexible frontier relationship with colonizers. Thus, while railroads often brought easy transport, commercial reliability, and predictability to colonial sites, this usually came at the expense of local trade, communications, economies, and cultures.

COMMUNICATION

Part of the drive for quicker transportation was to speed up communication services between colonial peripheries and centers. By sail, letters between France and Indochina, or between Britain and the Straits Settlements, took months on both outward and return voyages. The steamer revolution steadily shortened this over the nineteenth century, and steam companies competed for the coveted government contracts to deliver mail. For example, the Peninsular and Oriental Steam Navigation Company (P&O) won the contract to deliver mail from Britain to Gibraltar and then to Alexandria in Egypt, connecting with the Indian Navy's mail service from Bombay to Suez, where mail was transported between seas on camels. The Suez Canal, opened in 1869, was largely a French initiative. It was impressive less in terms of engineering technology than in terms of scale and significance. Built mainly by Egyptians, it was used largely by British ships. The British territorial acquisition of Egypt in 1882 was almost entirely about strategically securing the crucial Suez Canal route.

It was the technology of cable telegraphy—first on land, and then submarine—that enabled even quicker communication. By 1865 a cable linked Britain with India, but it ran overland through much non-British territory. Land cables could always be sabotaged and cut, and it was not until a new line was laid in 1870—mainly submarine from Britain to Gibraltar, Malta, and Alexandria and then on to Suez and India—that telegraphy between Britain and India became rapid and reliable.

French colonies were also increasingly linked by telegraphy, with a line laid between France and Algeria in 1879. The increasing reliance on cables for communication was occasionally the rationale for gaining control of territory. At other points it was crucial for communication in times of war, for example the Anglo-Boer War. Telegraphy reduced global communication time from weeks and months, to hours and days, thus promoting and enabling ever-expanding trade and business around the colonial world of the late nineteenth century.

Cable telegraphy was in turn challenged by the use of radio waves and wireless communication in the early twentieth century. In 1901 the first radio signals were transmitted across the Atlantic, from Cornwall to Newfoundland. Soon after, wireless stations appeared in the British, French, and German colonies, often with the ambition of creating seamless "wireless chains" around the empires. After 1924 shortwave transmission gave another burst of energy to imperial telecommunications, bringing the most isolated places within instant reach. Much cable telegraphy business had switched to shortwave wireless communication by the late 1920s.

MEDICINE AND HEALTH

Questions of health and medicine were linked to the colonial enterprise from the outset. As soon as Europeans crossed the Atlantic and explored and colonized the lands and people of Central America and the Caribbean, high mortality and illness rates became evident. Because of this experience of mortality, and because of a longstanding climatic understanding of health and disease, in which elements of heat, moisture, air, and environment were seen to be causative, a new field of medicine and science emerged. Initially under the rubric of "the diseases of warm climates," the discipline of tropical medicine arose explicitly from the colonial experience. To some extent, this colonial medicine was concerned with the mortality of Africans on the slave ships and on American plantations, often less for humanitarian than for commercial reasons. In the main, however, the concern was to reduce the massive mortality rates of European military personnel, settlers, missionaries, and travellers.

The colonial advance from the sixteenth century onward brought hitherto unknown microbes to the New World and brought others back to Europe, an interaction sometimes called the Columbian exchange. The demographic effects on colonizer and colonized populations were vastly different, however. For the Aztecs and Maya of Central America, for the Hawaiians, and for the Eora of eastern Australia, epidemics of infectious diseases meant illness, death, and often rapid depopulation. Smallpox and tuberculosis killed some people, while diseases such as syphilis and gonorrhoea frequently rendered others infertile, seriously altering patterns of reproduction and population replacement. Moreover, the massive changes in land use that often accompanied European colonization seriously compromised indigenous people's health through hunger and starvation, thus unravelling the viability of traditional social and political organization.

Western colonialism, then, created health and medical problems for Europeans, for indigenous people, and for the growing diasporas of people in forced and free migration. But colonialism was also driven by a desire to ameliorate these problems, and increasingly so over the centuries. Thus, for example, if the Hudson's Bay Company traders brought smallpox—both wittingly and unwittingly—they also sometimes brought the technology and the material of the smallpox vaccine. Practical assistance with health and hygiene was often among the first moves made by colonial missionaries around the world. Western and Christian health care undoubtedly relieved some suffering, but it was also political: it was a means of buying goodwill and, not infrequently, dependence and obligation. By the nineteenth century, when European, North American, and Australasian governments were developing public health bureaucracies and infrastructures in their home countries, the extension of hygiene as a rationale for colonial rule of indigenous people became increasingly common.

Pharmacological developments also had a mutual relationship to colonialism, both deriving from and facilitating European expansion. The anti-malarial drug quinine is one example. Local people in the Andes had long recognized the curative and preventive properties of the bark of the cinchona tree. Jesuits brought the bark to Europe in the seventeenth century, and thereafter securing sources of the bark was one reason for increasingly penetrating journeys into the region. Prompted by the need to reduce death rates from malaria in the military, French scientists successfully extracted quinine from the cinchona bark in 1820, undertook experimental research in Algeria, and began commercial production.

Thereafter, large quantities of quinine as anti-malarial prophylaxis were widely used, especially by British and French troops in tropical colonies. Its well-known efficacy clearly assisted British, U.S. and German explorations through Central Africa, and enabled a more permanent French presence in both North and West Africa. Mortality rates for European military as well as civilian populations in colonies began to fall dramatically, and the demand for the bark grew accordingly. This in turn spurred other colonial initiatives to cultivate the tree outside the Andes. The Netherlands East Indies government grew it successfully in Java, as did the British government in India. In the case of malaria, the cinchona tree, and quinine, colonialism created the conditions both of demand and supply.

ARMS

The history of colonialism is also the history of war. Armed combat took place between invading colonizers and indigenous people, from the Spanish invasion of Central America, to the British expansion into the Australian continent, to the Germans in the Congo. It also took place between competing colonial powers, often involving local people as well. The wars between the French and British in North America through the eighteenth century, for example, were consistently about securing territory on that continent. Especially from the middle of the nineteenth century, the technologies of firearms, and what historians sometimes call the arms gap, decided outcomes.

The arms gap sometimes enabled the massacre of local people by colonizers, with or without government consent. For example, the mass killing of the Kenyan Mau Mau rebels and civilians by the British after 1952 took the force it did partly because of the technology available. But it was not always the case that those with firearms were at an unquestioned advantage in colonial wars. For example, when British colonists came to settle in Sydney from 1788, they were often anxious about the spearing skills of the local men, used both to kill and for ritual punishment. The muzzle-loading muskets, which British soldiers and settlers held in that instance, took around one minute to load, and they had to be kept dry. They were simply not always a match for spearing technology. Nor was it consistently the case that colonized people were without firearms. People long involved in the slave trade in Africa, for example, were often armed with muskets and ammunition. The exchange of slaves for firearms was a basic one in that commercial circuit, although often the crudest and cheapest kind of firearm was bartered.

In another example, the Cree in present-day Canada exchanged furs for guns, dealing as middlemen between the Hudson's Bay Company and other Native groups, who gradually incorporated traded firearms into their way of life. The world of colonialism was constantly involved in firearms dealing.

Partly because of this trade, and driven by the demands of Western warfare, firearms technology gained pace in the nineteenth century. The invention of the small metal cap for explosives meant that after 1814 the imperative to keep muskets dry was minimized. New oblong bullets were invented in France in 1848, and were tested in the colonies: the French used these bullets first in Algeria, and the British against the Xhosa in the Kaffir War of 1851–1852. Around the 1860s there was a crucial technological shift from muzzleloaders to breechloaders. It was the breechloading gun that created a major discrepancy in power between those with and those without. The American-developed "repeating rifle" and the Maxim, invented by Hiram S. Maxim in 1884, only increased this discrepancy in power. The Maxim was light and could shoot multiple bullets each second. The explorer of Africa, Henry Morton Stanley, had a Maxim gun on his 1886–1888 expedition, as did Lord Kitchener in his conquest of the Sudan in 1898. Both used the gun to achieve their respective colonizing goals.

Knowledge, technology, and power go together. The history of colonialism is a history of often vastly different knowledge systems encountering one another. It is a history of competing, transferring, and evolving technologies. And it is a history of power relations, not always expressed physically and technologically, but frequently so. The major changes in the European world from the Renaissance onward, including the scientific revolution, the development of mercantile capitalism, the industrial revolution, and the communications revolution, all occurred in the era of colonialism, not incidentally but relatedly. The search for a newly valued scientific knowledge itself explicitly drove many European expeditions, especially in the eighteenth century. The development of technology often opened up new places and new means of travel, exploration, and colonization. Sometimes, science and technology were actively employed to rationalize extended colonial rule of people and territory, under a humanitarian and civilizing logic. Always, technologies established new Western infrastructures in foreign places, which created a momentum of exponential expansion for trade, commerce, government, and settlement.

see also China, Foreign Trade; Railroads, Imperialism; Sugar Cultivation and Trade; Tobacco Cultivation and Trade.

BIBLIOGRAPHY

Ballantyne, Tony, ed. Science, Empire, and the European Exploration of the Pacific. Burlington, VT: Ashgate, 2004.

Chaplin, Joyce E. Subject Matter: Technology, the Body, and Science on the Anglo-American Frontier, 1500–1676. Cambridge, MA: Harvard University Press, 2001.

Crosby, Alfred W. Ecological Imperialism: The Biological Expansion of Europe, 900–1900. Cambridge, U.K.: Cambridge University Press, 2004.

Curtin, Philip. Death by Migration: Europe's Encounter with the Tropical World in the Nineteenth Century. Cambridge, U.K.: Cambridge University Press, 1989.

Headrick, Daniel R. Tools of Empire: Technology and European Imperialism in the Nineteenth Century. New York: Oxford University Press, 1981.

Headrick, Daniel R. The Tentacles of Progress: Technology Transfer in the Age of Imperialism, 1850–1940. New York: Oxford University Press, 1988.

Hobsbawm, E. J. Industry and Empire: From 1750 to the Present Day. Harmondsworth, U.K.: Penguin, 1987.

Lux, Maureen K. Medicine that Walks: Disease, Medicine, and Canadian Plains Native People, 1880–1940. Toronto, Canada: University of Toronto Press, 2001.

Science and Technology

views updated May 23 2018

SCIENCE AND TECHNOLOGY

Scholars of science and technology increasingly recognize the mutual influence between science and technology. Scientific understanding is often a prerequisite for technological advance. Technology in turn provides important inputs to science, including, most obviously, scientific instruments, but also research questions about why certain technologies work or do not work. Sometimes the links between science and technology are temporally close: In the development of nylon, radios, and airplanes, for example, scientific and technological advances were mutually reinforcing. In other cases science answers technological questions that have been around for decades, or science sets the stage for new products and processes that are not yet imagined.

TECHNOLOGY AND THE GREAT DEPRESSION

While economic analyses of the causes of the Great Depression have focused on a handful of aggregate economic variables, such as the money supply, there has always been a minority tradition that has argued that technology was largely responsible for the Great Depression. Two broad types of technological innovation can be distinguished. Product innovation involves the development of a new or improved product. Process innovation involves the development of lower-cost ways of producing an existing product. The line between these is generally clear, but it can be blurred, as when the same innovation both reduces the cost and increases the quality of a particular product.

The course of technological innovation during the interwar period was highly unusual. In terms of product innovation, the decade between 1925 and 1935 is by far the worst in the entire twentieth century. The electric refrigerator was the only major new product. In terms of process innovation, however, there was rapid growth in worker productivity, due primarily to key technological advances. Worker productivity in industry grew by at least 50 percent in the 1920s and another 25 percent in the 1930s.

The development of new products will generally encourage investment, consumption, and employment. Factories and machines will be constructed to build the new products, and workers hired for this purpose. Consumers in turn will be encouraged to spend a greater proportion of their income in order to obtain the new product—though they may decrease purchases of existing goods that serve similar purposes (as the advent of radio and talking movies during the 1920s served to destroy vaudeville and decrease attendance at live theatre in the 1930s). Process innovation means that fewer workers, buildings, and machines will be needed to produce the same quantity of goods, though some initial investment may be required. Lower prices for existing goods will generally result in lower levels of consumer expenditure. There are exceptions when decreased cost leads to a more than proportional increase in the number of goods purchased.

In the long run, a market economy should be able to adjust to the uneven time path of technological innovation. Many economic models suggest, however, that over a period of a few years product innovation will cause a decrease in unemployment, and process innovation will cause an increase. The unemployment experience of the 1930s was likely exacerbated by market saturation in some of the new consumer durables of the 1920s, notably automobiles, radios, and various appliances: Consumers who had recently bought one did not need another.

CONSEQUENCES OF THE SECOND INDUSTRIAL REVOLUTION

To understand the technological experience of the Great Depression, it is useful to start with the second Industrial Revolution of the late nineteenth century. During the 1870s and 1880s, important innovations occurred in three broad areas: chemicals, internal combustion engines, and the generation, transmission, and use of electricity. With the singular exception of the zipper, all major twentieth century innovations, whether product or process, can be traced to one or more of these developments. All three of these strands of technological innovation would generate new products in the early 1920s and late 1930s; they would also generate important process technologies that would have their major period of adoption during the interwar period. In each case the new products of the late 1930s were much more complex than those of the early 1920s; this may explain the paucity of new products in the decade after 1925.

The chemical industry developed continuous processing in place of the previous practice of producing one batch of chemicals at a time. This process technology was adopted by most factories producing a homogeneous output, whether paint or ketchup or oil, in the interwar period, and resulted in huge cost savings. In terms of new products, the major development of the early 1920s involved the semi-synthetic fiber rayon. The first fully synthetic fiber, nylon, would appear almost two decades later in 1939. Developments in plastics—including urea-formaldehyde, lucite, and vinyl—vitamins, and antihistamines are among other new products that would emerge in the late 1930s. The discovery of penicillin by Alexander Fleming in 1928 had a limited economic impact until methods for increasing the rate of natural production in mold were developed in 1940, and synthetic penicillin in 1944. These advances set the stage for the development of a range of antibiotics. Better photographic film and cameras would make the camera a mass market good in the 1930s as well.

Although the automobile was invented long before World War I, it was only with the development of the assembly line at Henry Ford's Highland Park plant in 1913 that it became a potential mass market good. Sales would take off in the 1920s, and almost half of American families would own an automobile by 1929. The assembly line would be adapted to virtually every assembled good in the United States during the 1920s. While sales, and thus employment, in the automobile sector increased as a result, in other sectors the assembly line soon led to decreases in employment.

The automobile and truck generated cost savings in the distribution of goods. In particular, small local stores were replaced by larger and lower-cost enterprises. In agriculture, tractor use expanded throughout the interwar period, as costs decreased and quality improved: There were 80,000 tractors in 1918; 920,000 in 1930; and 1,545,000 in 1940. As tractor use and power expanded, a host of farm implements were developed.

The airplane also predated World War I, but it would only become a mass produced good, capable of generating significant economic activity in production and use, in the mid-1930s. The Douglas DC-3 of 1936 was the biggest single advance, causing costs per passenger mile to drop to a quarter of their level in 1929.

The bulk of American industry would switch to electric power in the 1920s. By the end of the 1930s the victory of electricity was almost complete. This switch reflected both the drop in the cost of electricity (by 50 percent in the 1920s alone) due to improvements in generation and transmission, and the development of ever-better electric motors. Electrification allowed great improvements in factory layout because machines powered by their own small motor, rather than connected by belts to an external power source, could be situated as needed. Electrification also made it much easier to run machines at different speeds. Many processes previously performed by hand were mechanized because of electrification.

In the home, the early twentieth century witnessed the electrification of simple goods like lightbulbs, toasters, and kettles. In the interwar period, a second stage of innovation, involving complex electronics, became evident. The technology of radio transmission and reception advanced to the point that the first commercial radio station was established by the Westinghouse Corporation in Pittsburgh in 1920. By 1929 a large share of American households owned at least one radio. The next major innovation in wireless communication would be the television. After a host of improvements in the 1920s and 1930s, the first regular broadcasts in the United States began in 1939.

The electric refrigerator only became a mass consumer good in 1931 after a series of improvements over the previous decades. Sales expanded until 1937, at which point half of wired homes possessed a refrigerator. The success of the only major new product innovation of the early 1930s suggests that consumption, investment, and employment would all have been higher if other new products had emerged.

While productivity advance continued in the 1930s, this required little investment. The major new process innovation of the 1930s was tungsten carbide cutting tools. These cutting tools could generally be fitted to existing machines, but allowed much greater speed and accuracy. There were also improvements in management techniques. The rolling mill for producing sheet metal was one process innovation that did require significant investment. This only became technically superior to labor-intensive methods in 1930; twenty-eight rolling mills were built during the 1930s.

INDUSTRIAL RESEARCH LABORATORIES

Developed by German chemical firms in the late nineteenth century, the industrial research laboratory was adopted in the United States first in electrical products but eventually across a wide range of industries. Although innovations before the interwar period still often came from isolated tinkerers, after that time most new technologies were developed, at least in part, in formal industrial research settings. These laboratories have played an important scientific role. The invention of nylon depended on DuPont's research into the chemical composition of polymers. The airplane depended on advances in aerodynamic theory (in this case financed largely by the military).

The earliest industrial research laboratories, however, shied away from the expense and risk associated with pursuing projects in basic science. They instead tended to focus on process innovation and minor product innovation. The profits earned by companies pioneering such products as television or nylon in the late 1930s would encourage many industrial research laboratories to pursue major product innovation in the postwar years. If this change had occurred earlier, the technological, and thus economic, experience of the Great Depression might have been quite different.

SCIENTIFIC RESEARCH IN UNIVERSITIES

There were concerns during the 1930s that industrial research laboratories were not only absorbing those who might otherwise have been independent inventors, but attracting some of the best scientists away from universities. Yet industrial research laboratories also had a positive impact on university research: They provided direct funding to some researchers and also funded graduate fellowships.

The major source of funding of scientific research in the 1930s, though, was philanthropic foundations. Of these foundations, those tied to the Rockefeller or Carnegie names, and particularly the Rockefeller Foundation, provided some 90 percent of the funding. These foundations had, earlier in the century, tried to develop research facilities independent of universities. As the century unfolded universities came to see research as a key part of their mission. While professors now had the time and incentive to perform research, they also needed direct funding of research expenses. The foundations in the 1920s had provided funding to certain departments at key universities. In the 1930s the foundations moved toward supporting the research of individual professors. Determined to maximize the return on their funding, they insisted on evidence of publications before funding was renewed. This likely encouraged scientific effort, but raised concerns about scientific independence. Funding decisions were based on individual contacts between professors and foundation officers. Total foundation expenditures on research stagnated during the Depression; medical research was emphasized at the expense of basic science, and thus, for example, theoretical physicists seeking funding for particle accelerators spoke to the biological understanding that might result from these.

Governments, especially in the areas of military, health, and agricultural research, provided some limited funding in the 1930s. During World War II, expenditure on research would triple, with governments funding the increase. Postwar government funding of university research would soon eclipse foundation funding; governments would rely on a more bureaucratic process of official grant applications, and review by other experts in the area. Due largely to foundation encouragement the United States had become the most important country in the world of science by 1930; movement of refugee scientists to the United States would enhance this dominance over the next decade.

Science was not only primarily identified with universities by the 1930s, but also with distinct disciplines such as physics and chemistry. Nevertheless, both foundations and scientists appreciated the value of interdisciplinary interaction. Many of the major scientific discoveries of the Depression era reflect cross-disciplinary communication. While physics loomed large in these conversations, advances in, for example, quantum chemistry, did not just involve the application of quantum theory, but the convergence of quantum theory with theoretical and empirical trends within chemistry itself. Scientific advances during this period also reflected the development of a host of new scientific instruments, such as the particle accelerator, electron microscope, and ultracentrifuge.

The discovery of the neutron in 1932 allowed physicists for the first time to understand the stability of nuclear structure in terms of quantum theory. The positron was also discovered in 1932, and physicists began to enumerate the various forces that operate between different particles. By the end of the decade, scientists in both Germany and the United States had achieved nuclear fission.

Improved instruments for studying distant parts of the universe, in combination with the theory of general relativity, led to widespread consensus among astronomers in the 1930s that the universe was expanding, though there was little consensus on how the process might have begun. At the same time, advances in nuclear physics allowed a theoretical understanding of how stars could generate energy over billions of years.

By understanding the internal working of molecules chemists were better able to predict and control chemical reactions. Polymer science in particular advanced rapidly. Only in the early 1930s had chemists come to accept the existence of large complex polymer molecules. An understanding of the internal working of molecules was also useful in the study of living organisms: Major advances occurred in the analysis of proteins that would set the stage for the postwar discovery of DNA.

The 1930s was also the period of the "modern synthesis" in biology. Theoretical and empirical discoveries in the area of genetics were shown to be consistent with "natural history." That is, the short-term genetic changes observed in the laboratory could be understood as generating the longer-term changes posited by evolutionary theories.

See Also: FORD, HENRY; RADIO.

BIBLIOGRAPHY

Beaudreau, Bernard C. Mass Production, the Stock Market Crash, and the Great Depression: The Macroeconomics of Electrification. 1996.

Bernstein, Michael A. The Great Depression: Delayed Recovery and Economic Change in America, 1929–1939. 1987.

Brock, William H. The Norton History of Chemistry. 1992.

Cross, Gary, and Rick Szostak. Technology and American Society: A History. 1995.

Freeman, Christopher, and Francisco Louca. As Time Goes By: From the Industrial Revolutions to the Information Revolution. 2001.

Fruton, Joseph S. Proteins, Enzymes, Genes: The Interplay of Chemistry and Biology. 1999.

Kohler, Robert E. Partners in Science: Foundations and Natural Scientists, 1900–1945. 1991.

Kragh, Helge. Quantum Generations: A History of Physics in the Twentieth Century. 1999.

Krige, John, and Dominique Pestre, eds. Science in the Twentieth Century. 1997.

North, John. The Norton History of Astronomy and Cosmology. 1995.

Nye, Mary Jo, ed. The Cambridge History of Science, Vol. 5: The Modern Physical and Mathematical Sciences. 2003.

Reich, Leonard S. The Making of American Industrial Research: Science and Business at GE and Bell, 1876–1926. 1985.

Schumpeter, Joseph A. Business Cycles: A Theoretical, Historical, and Statistical Analysis of the Capitalist Process. 1939.

Szostak, Rick. Technological Innovation and the Great Depression. 1995.

Rick Szostak

Science and Technology

views updated May 14 2018

SCIENCE AND TECHNOLOGY

With the notable exception of Israel, science and technology in the Middle East is at an embryonic stage, especially when compared to the West. Whether and how it develops will depend largely on politics and economics in each country and in the area.

The science and technology systems in most Middle Eastern countries are, with two exceptions, similar to those in other developing countries. Israel, whose system is akin to that of industrial countries, is the major exception. The other is Afghanistan, which has not yet established a scientific infrastructure.

Most Middle Eastern countries are primarily interested in applying science and technology for development. Some have sought to acquire capabilities in defense technologies but have been only partially successful. Israel alone has succeeded in applying technology for developmental and military purposes.

With the exception of Israel, information on professional manpower and science-related institutions in all countries is limited.


Manpower Development

Governments of the region have long recognized the importance of professional manpower to national development and have consequently devoted considerable efforts and resources to the provision of higher education. During the early 1950s, most countries except for Egypt and Israel suffered from shortages of professional manpower. These shortages have today been overcome everywhere in the region except Afghanistan.

Substantial numbers of engineers and scientists are now available. The Arab countries are in the lead, with a total of some 600,000 engineers. The figures on research and development (R&D) scientific manpower, though incomplete and fragmentary, are as follows: Egypt (1986), 21,000; Iran (1985), 3,200; Israel (1984), 20,000; Turkey (1985), 11,300. These countries also had a substantial number of university professors: Egypt (1988), 33,000; Algeria (1988), 14,000; Morocco (1989), 7,000; Iraq (1986), 4,600; Saudi Arabia (1988), 10,000; Syria (1986), 5,000; Iran (1988), 14,000; Turkey (1989), 31,000.

Graduate level education and postdoctoral specialization in the basic and applied sciences are still dependent on foreign study.

Despite large numbers of scientists and engineers, the science and technology systems in most countries suffer from a lack of articulation: higher education is not integrated with demand. Moreover, continuing and distance education is still underdeveloped. Consequently, there is an inability to adapt and upgrade manpower skills in an efficient and cost-effective manner.

Israel, by contrast, depends heavily on educated immigrants. Its universities are of high quality, and effective systems of continuing and distance education have been introduced.


Research & Development

R&D in Israel is at the same level as that of leading industrial countries. Israel publishes about 10,000 papers a year in refereed journals surveyed by the Institute for Scientific Information (ISI) in Philadelphia. Its per capita publication output compares favorably with that of the United States, and the profile of its publications is similar to that of other industrial countries.

Israeli researchers circulate in and receive funding and support from European and American research establishments. A considerable proportion of Israeli R&D is directed toward weapons systems; but Israel also has strong research programs in most scientific and technological fields of relevance to its economy. It devotes about 3 percent of its gross national product (GNP) to R&D, and currently has about 50,000 research scientists. Its heavy emphasis on military technology is, however, causing serious economic problems as a result of the current collapse of the world demand for weaponry.

The scientific output of the Arab countries can be compared favorably with that of Brazil and India, the leading developing countries. During the 1980s, the number of scientific publications per million inhabitants was 18 (Brazil), 16 (India), and 15 (the Arab world). The per capita output of the Arab countries is some 2 percent that of industrial countries. In 1990, there were more than 5,000 publications from 700 Arab institutions. Half of these were from 12 institutions, 11 of which were universities. Other institutions involved in publishing were hospitals and agricultural research stations.

R&D in the Arab countries is overwhelmingly of an applied nature. Thirty-eight percent of publications are in medicine; 20 percent in agriculture; 17 percent in engineering; 17 percent in the basic sciences; and 8 percent in economics and management. Even work that is classified as basic science is often of an applied nature. The three leading countries in order of research output are Egypt (37 percent), Saudi Arabia (20 percent), and Kuwait (12 percent). In 1990, Kuwaiti output had started to approach that of European countries.

Publications from Iran and Turkey are on a more limited scale; their output in 1990 was 161 and 1,300, respectively. The number of publishing institutions was 80 (Iran) and 155 (Turkey).

The profile of publications from Iran and Turkey, like that in the Arab countries, emphasizes traditional and applied fields such as medicine and agriculture; the proportion of publications in the basic sciences, molecular biology, information sciences, and other advanced areas is far below international levels.

The exact funding of R&D in the Arab world, Iran, and Turkey is not accurately known; it is estimated, however, to be below 1.0 percent (probably closer to 0.5 percent) of GNP throughout the region.


Institutional Framework

The capacity to apply science and technology depends on the prevailing institutional framework rather than on the actual number of professionals. Most of the countries have some form of institution to manage science and technology: ministries of science and technology, or directorates attached to the ministry of higher education, the ministry of planning, or the prime minister's office, which are responsible for different aspects of science and technology.

But the pervasive nature of science and technology is still not recognized, and these institutions are generally bureaucratic and inflexible; they tend to regard science and technology as being restricted to R&D and manpower.

Once again, Israel is the exception; it has established an effective and comprehensive system of science policy planning and management.


The Application of Science and Technology

Some of the instruments through which science and technology are developed and applied are: consulting and contracting organizations, agricultural research stations, extension programs, hospitals, industrial firms, testing laboratories, information services, and others.

Most countries have organizations to provide these services that vary in competence and efficiency. A brief description follows of two strategic types of organizations.

Consulting organizations are critical instruments for planning and designing new projects and for adapting and transferring technology. A substantial number of state-run and private consulting firms have been established throughout the region. In fact, one of the largest international consulting firms in developing countries is Lebanese (Dar al-Handasah [Shair & Partners]). Large public-sector consulting firms are found in most countries of the region.

Consulting firms are heavily oriented toward civil engineering technologies, with the result that the region is still dependent on the importation of consulting services in industrial technology.

Contracting organizations bring together ideas, plans, materials, equipment, labor, and financing to produce the desired products within an agreed schedule and cost. The largest contracting firms in the region are in Turkey, whose government has provided them with the necessary financial, risk cover, and diplomatic support.

There are around 100,000 Arab contracting firms, but the Arab countries still depend on foreign firms for 50 percent of their requirements. This is largely due to the absence of appropriate public policies. The leading Arab contracting companies are privately owned and based in Lebanon and Saudi Arabia.


National Science Policies

Israel is the only country in the region with the capacity to design and implement science and technology policies. In the rest of the region, national, regional, and international organizations have sought to promote the development of capabilities in science policy formation, but the results have been limited. This is due to the prevalence of preindustrial political cultures, which have made science policy formation difficult, if not impossible.

As of the mid-1990s there were increasing indications that Turkey would soon acquire an industrial political economy. When it does so, it will be capable of formulating and implementing science policies.


The colonial legacy of the region has led to the virtual elimination of intersectoral linkages and has resulted in the vertical integration of the components of a fragmented economy into foreign sources of technology. This situation has prevented the acquisition and accumulation of technological experiences, which in turn has reduced the chances of a transition to an industrial political economy.


The combination of underused capabilities and unexpected developments could open the way to technological change. For example, the heavy bombing of Iraq, coupled with the stringent economic blockade, has forced the mobilization of Iraq's considerable capabilities in science and technology, which had previously been marginalized. A massive reconstruction of the country has consequently taken place. A similar dynamic applied to Iran during the 1980s.


Different countries in the region may discover how to mobilize their considerable professional scientific and technological manpower after other alternatives are no longer available. These challenges could induce changes in the political culture, which in turn could result in new attitudes toward science and technology.


Bibliography


ALECSO. Strategy for the Development of Science and Technology in the Arab World. Tunis, 1987. English version available.

Institute for Scientific Information. Science Citation Index. Philadelphia: Author, 1970–1991 (monthly).

Organisation for Economic Co-operation and Development. Main Science and Technology Indicators. Paris: Author, 1992 (biannual).

UNESCO. UNESCO's Yearbook. Paris, 1970–1991 (annual).

Zahlan, A. B. The Arab Construction Industry: Acquiring Technological Capacity. Basingstoke, U.K.: Macmillan, 1991.

Zahlan, A. B. Science and Science Policy in the Arab World. London: Croom Helm, 1980.

antoine benjamin zahlan

Science and Technology

views updated Jun 11 2018

Science and Technology

A Useful Art. In 1835 Alexis de Tocqueville wrote in Democracy in America that "the social conditions and institutions of democracy prepare [Americans] to seek immediate and useful practical results of the sciences." The desire of scientists to promote their profession as a useful art rather than a purely theoretical discipline led to efforts to join the forces of scientific and technological innovation. Technological innovation had remained largely in the hands of artisans and mechanics, but the increasing sophistication of machinery and the demands of an industrializing society made cooperation with scientists both necessary and desirable. One important example of scientific influence on technological development was the research of Princeton scientist Joseph Henry into the principles of electromagnetism. The principles that Henry detailed in his work made it possible for Samuel F. B. Morse to develop an effective telegraph in 1837 and also proved fundamental to the development of electrical motors later in the century.

The Franklin Institute. By 1824 the democratic spirit of the age, together with the desire to promote concerted efforts between scientists and mechanical innovators, led to the founding of the Franklin Institute in Philadelphia. Designed to bring together well-educated gentlemen scientists and working-class artisans and mechanics, the institute began as an educational enterprise. In addition to providing a scientific curriculum for the workers and holding annual exhibitions of their inventions, the institute published a journal designed to communicate useful scientific information in a manner that skilled but relatively uneducated workers could comprehend. The institute quickly attained a sound financial footing, a substantial membership, and a strong circulation for its journal.

Engineering Profession. The Franklin Institute did not, however, succeed in bridging the class differences between artisans and mechanics on the one hand and scientists on the other. Instead it contributed to the rise of the engineer, a new kind of professional who combined both scientific education and mechanical skill. The Franklin Institute was the most famous of similar institutions established in the 1820s and 1830s that led to the establishment of the engineering profession. For example, in 1824 (the same year that the Franklin Institute was founded) Stephen Van Rensselaer established a school in upstate New York for the purpose of instructing persons ... "in the application of science to the common purposes of life." That school would eventually become Rensselaer Polytechnic Institute. Engineers had already begun to make their presence felt by the early 1830s, having contributed to the design and construction of an elaborate system of canals in the Northeast, of which the Erie Canal, opened in 1825, was the most famous. Engineers would go on to make significant contributions to the application of steam power to printing and manufacturing, the improvement of locomotive technology, and the construction of roads and bridges.

Source

David Freeman Hawke, Nuts and Bolts of the Past: A History of American Technology, 1776–1860 (New York: Harper & Row, 1988).
