The 1990s Science and Technology: Topics in the News


HUMAN GENOME PROJECT
THE INTERNET
THE ENVIRONMENT: FROM GLOBAL WARMING TO THE EARTH SUMMIT
A WIRELESS WORLD
NASA: PROBE FAILURES, HUBBLE, AND THE ISS

HUMAN GENOME PROJECT

The Human Genome Project (HGP), officially launched on October 1, 1990, is a multibillion-dollar, international scientific research effort to map all of the estimated fifty thousand to one hundred thousand genes on the twenty-three pairs of human chromosomes and to read their entire sequence, or arranged order. In the nucleus of any normal human cell, there are forty-six chromosomes (twenty-three pairs), and within each chromosome is bundled a long, coiled molecule called deoxyribonucleic acid (DNA). Each DNA molecule contains identifiable subunits known as genes. Each gene carries genetic instructions for everything from hair color and height to how the brain is organized. All of the genes together are called the genome. Thus mapping, or decoding, all of the genes in the human genome would give scientists unprecedented understanding of the human body and could point to the eventual diagnosis, cure, or elimination of many genetic diseases and disorders.

The HGP is a mammoth, thirteen-year federal project involving scientists from at least eighteen countries. In 1998, a private company, Celera Genomics, entered the research effort. Francis S. Collins, director of the National Human Genome Research Institute (NHGRI) at the National Institutes of Health (NIH), asserted that this research was the most important organized scientific effort humankind had ever attempted. By 1999, one-quarter of the human genome code had been spelled out by teams of government-sponsored scientists and by their corporate competitors. Computer technology played an important role in making genetic research possible: it provided the communication and organizational means to manage the genetic information being discovered, as well as the tools to build machines that made gene sequencing easier.

In December 1999, an international team announced it had achieved a scientific milestone by compiling nearly the entire code of a human chromosome for the first time. Researchers chose chromosome twenty-two because of its relatively small size (just over thirty-three million base pairs, the chemical building blocks of DNA) and its link to many major diseases. The sequence they compiled is over twenty-three million letters in length and is the longest continuous stretch of DNA ever deciphered and assembled. Researchers were able to find only 97 percent of the genetic material, but the results were considered complete for the time. More than thirty human disorders already were associated with changes to the genes of this chromosome, including a form of leukemia, disorders of fetal development and the nervous system, and schizophrenia. Scientists expected the decoding of the rest of the genome to come quickly.

Private companies that were working to map the human genome hoped to beat the HGP to the finish line in order to win lucrative patents on new genetic discoveries. J. Craig Venter, president of Celera Genomics Corporation, declared in 1998 that his company would have a map of the entire human genome ready in 2001, years ahead of the original HGP estimated date of completion of 2005. His announcement forced the leaders of the HGP to move up their deadline to 2003 for a finished product and 2001 for a "working draft."

Critics of Venter and other private researchers argued that racing to decode the entire human genome would make for sloppy and incomplete results, but private companies asserted that the painstakingly precise research done at the HGP was slow and unnecessary. They argued that mapping all of the genes in encyclopedic detail delayed scientists from finding and concentrating on the important genes that could be analyzed to help prevent or cure genetic diseases. Details of a rough map, they believed, could be filled in later.

Scientists at the HGP predicted that Venter's genome map would be full of holes. In addition, many felt that allowing patents on human genes was unethical, since no one should "own" the human genetic code. They also worried that his financial backers would file patents (exclusive rights to market inventions), limiting access to the information and thus blocking the advancement of science. Patents on genetic information could deter scientists from doing important research, since only the company or government institution that held a patent could profit from new discoveries pertaining to the patented information.

The promise of these lucrative rights and scientific prestige drove researchers in the private and public sector to hasten their efforts toward a finished map of the human genome. Only after Venter met with Collins did the two sides agree that cooperation would achieve more than competition. So on June 26, 2000, Celera and HGP jointly announced the completion of a rough draft of the human genome, having put together a sequence of about 90 percent.

Total project completion—closing all of the remaining gaps and improving the map's accuracy—is expected in 2003. All researchers agree that once the gene sequence is completed, the next step will be to look into how genes vary from one person to the next, in effect, decoding the genetic basis of human individuality.

THE INTERNET

The revolutionary technology of the Internet and the World Wide Web created a whole new digital culture in America during the 1990s. The idea of an "Information Superhighway" that could link anyone in the world through nearly instantaneous data transmission became a reality. Terms such as "cyberspace" and "the Net" became part of everyday speech. The introduction of the Internet into mainstream American society changed the ways in which business was conducted, information was exchanged, and social interactions were carried out.

Joseph C. R. Licklider, a psychologist at the Massachusetts Institute of Technology (M.I.T.), first conceived the idea of an Internet, or an interconnected computer network, in 1962. He envisioned a globally interconnected set of computers through which anyone with a computer terminal could quickly access data and programs from another computer. That year, Licklider became the first head of the computer research program at the Advanced Research Projects Agency (ARPA), an agency of the U.S. Department of Defense. Created in 1958 by President Dwight D. Eisenhower (1890–1969), ARPA was the first U.S. response to the Soviet launching of the unmanned satellite Sputnik 1 in October 1957, the event that marked the beginning of the space race between the United States and the Soviet Union.

The mission of ARPA was to ensure that the United States maintained a lead in applying state-of-the-art technology for military purposes and to prevent technological surprises from an enemy. In 1969, scientists at ARPA created the first packet-switching computer network, known as the ARPANET, which initially linked a total of four university and research computers. Within a year, twenty-three host computers were linked.

The major characteristic of ARPANET, and one of the key innovations that made the Internet possible, was the way it used the new idea called "packet switching." Data, or information to be transmitted from one computer to another, were divided into pieces or "packets" of equal-size message units, then launched into the network. Each packet was like a postcard carrying some part of the message. The packets found their way through the network along any paths that were open to them, moving from node to node along the network. A node was like a post office that sent the postcards along toward the recipient. Each node kept copies of the packets and continued to send them out until the packets successfully reached the next node. When all of the packets reached the final destination, they were reassembled and the complete message was delivered to the recipient. For defense purposes, this system seemed ideal since if there were any working path to the final destination, no matter how indirect, the new network would find it and use it to get the message through.
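The routine described above can be sketched in a few lines of modern code. The short Python example below is only an illustration, not ARPANET software: the four node names echo the first ARPANET sites, but the network map, the packet size, and the message itself are invented for the example. It splits a message into numbered packets, lets each packet hop from node to node along whatever links are open, and reassembles the pieces at the destination.

# Minimal sketch of packet switching: split, route hop by hop, reassemble.
# The topology, packet size, and message below are invented for illustration.

PACKET_SIZE = 8  # characters per packet (real networks measure in bytes)

# A toy network map: each node lists the neighbors it can forward to.
NETWORK = {
    "UCLA": ["SRI", "UCSB"],
    "UCSB": ["UCLA", "UTAH"],
    "SRI":  ["UCLA", "UTAH"],
    "UTAH": ["UCSB", "SRI"],
}

def split_into_packets(message, source, destination):
    """Divide a message into numbered packets, like postcards."""
    chunks = [message[i:i + PACKET_SIZE] for i in range(0, len(message), PACKET_SIZE)]
    return [{"seq": n, "total": len(chunks), "src": source,
             "dst": destination, "data": chunk}
            for n, chunk in enumerate(chunks)]

def route(packet, start):
    """Forward a packet node to node until it reaches its destination."""
    path, current = [start], start
    while current != packet["dst"]:
        # Take any open link that has not been visited yet (no central control).
        next_hop = next(n for n in NETWORK[current] if n not in path)
        path.append(next_hop)
        current = next_hop
    return path

def reassemble(packets):
    """Put packets back in order and rebuild the original message."""
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = split_into_packets("LOGIN REQUEST FROM HOST A", "UCLA", "UTAH")
for p in packets:
    print(f"packet {p['seq']} travels {' -> '.join(route(p, p['src']))}")
print("reassembled:", reassemble(packets))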

As this system slowly grew, it became apparent that eventually the computers at each different location would need to follow the same rules and procedures if they were to communicate with one another. In fact, if they all went their separate ways and spoke a different "language" and operated under different instructions, then they could never really be linked together in any meaningful way. More and more, the scientists, engineers, librarians, and computer experts who were then using ARPANET found that the network was both highly complex and very difficult to use. As early as 1972, users were beginning to form a sort of bulletin board for what we now call E-mail (electronic mail). This made the need for common procedures even more obvious, and in 1974, what came to be called a common protocol was finally developed. Protocols are sets of rules that standardize how something is done so that everyone knows what to do and what to expect. This common language was known as Transmission Control Protocol/Internet Protocol (TCP/IP).

The development of this protocol proved to be a crucial step in the development of a working network since it established certain rules or procedures that eventually would allow the network to expand. One of the keys of the protocol was that it was designed with what was called "open architecture." This meant that each small network could be separately designed and developed on its own and not have to modify itself in any way in order to be part of the overall network. Instead, each network would have a "gateway" (usually a larger computer) whose special software linked it to the outside world. In order to make sure that data were transmitted quickly, the gateway software was designed so that it would not hold on to any of the data that passed through it. This not only sped things up, but it also removed any possibility of censorship or central control. There was no board of directors that dictated how the Internet could be used or what information could be passed along it. Finally, data would always follow the fastest available route, and all networks were allowed to participate. The Internet belonged to everyone.
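TCP/IP remains the common language that networked programs speak today, and most programming languages expose it through a "sockets" interface. The brief Python sketch below illustrates that idea rather than describing any historical system; the host address, port number, and message are arbitrary values chosen for the example. One routine listens for a TCP connection while another connects to it and sends a message, and the shared protocol is what lets the two sides understand each other.

# Minimal TCP/IP example using Python's standard socket module.
# Host, port, and message text are arbitrary values chosen for illustration.
import socket
import threading

HOST, PORT = "127.0.0.1", 50007
ready = threading.Event()

def server():
    """Listen for one TCP connection and acknowledge whatever arrives."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                         # tell the client it is safe to connect
        conn, _addr = srv.accept()
        with conn:
            data = conn.recv(1024)          # receive up to 1024 bytes
            conn.sendall(b"ACK: " + data)   # reply over the same connection

def client():
    """Open a TCP connection, send a message, and print the reply."""
    ready.wait()                            # wait until the server is listening
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"hello over TCP/IP")
        print(cli.recv(1024).decode())

t = threading.Thread(target=server)
t.start()
client()
t.join()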

The mid-1980s marked a boom in the personal computer industry. Inexpensive desktop machines and powerful network-ready servers allowed many companies to join the Internet for the first time. Corporations began to use the Internet to communicate with each other and with their customers. Throughout the 1980s, more and more small networks were linked to the Internet, including those of the National Science Foundation, the National Aeronautics and Space Administration, the National Institutes of Health, and many foreign, educational, and commercial networks. The nodes in the growing Internet were divided up into Internet "domains," known as "mil," "org," "com," "gov," "edu," and "net." "Gov," "mil," and "edu" denoted government, military, and educational institutions, which were the pioneers of the Internet. "Com" stood for commercial institutions that were soon joining in, along with nonprofit organizations, or "orgs." "Net" computers served as gateways between networks.

In 1990, ARPANET was decommissioned, leaving the vast network of networks called the Internet. Although growing, the Internet at this time was no place for a beginner. The main problem was that every time users wanted to do something different on it (such as E-mail or file transfer), they had to know how to operate an entirely separate program. Commands had to be either memorized or reference manuals had to be consulted constantly. The Internet was not "user-friendly."

The First All-Computer-Animated Movie

In 1995, Walt Disney Pictures released the huge hit film Toy Story, the first full-length animated feature to be created entirely by artists using computer tools and technology instead of drawing scenes by hand. The completely computer-generated movie took four years to make, lasted seventy-seven minutes, and contained 1,561 computer-generated shots. To create the movie, Disney teamed up with Pixar Animation Studios, a pioneer in computer graphics and the first digital animation studio in the world. Using their own proprietary software, with computers as their tools, the moviemakers introduced a three-dimensional animation look, with qualities of texture, color, vibrant lighting, and detail never before seen in traditional animated features.

The development of what came to be called the World Wide Web in 1991 marked the real breakthrough of the Internet to a mass audience of users. The World Wide Web is a software package that was based on "hypertext." In hypertext, links are "embedded" in the text, meaning that certain key words are either underlined or appear in a contrasting color. The user can click on these links with a mouse to be taken to another site containing more information. The World Wide Web made the Internet simple to understand and enabled even new users to explore or "surf" the Net.
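Because hypertext links are simply markers embedded in a page's text, even a small program can find them. The Python sketch below uses the language's standard html.parser module on a made-up fragment of a Web page (the text and the addresses in it are invented for the example) to pull out every clickable link, much as a browser does before underlining or coloring the key words.

# Extract embedded hypertext links from an HTML fragment.
# The HTML snippet and its addresses are invented for illustration.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the destination of every <a href="..."> link encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

page = """
<p>Read about the <a href="http://example.org/hgp">Human Genome Project</a>
or surf over to the <a href="http://example.org/mosaic">Mosaic browser</a>.</p>
"""

parser = LinkExtractor()
parser.feed(page)
print(parser.links)   # ['http://example.org/hgp', 'http://example.org/mosaic']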

MP3 Rocks the Music World

MP3, short for Moving Picture Experts Group Audio Layer III, is a computer format that compresses an audio file of tens of megabytes (MB) into just a few megabytes. A standard MP3 compression works at about a 10:1 ratio and yields a file of roughly 4 MB for a three-minute audio track. MP3 technology started in the mid-1980s, but the format did not take off until the first MP3 player was developed in 1997. The following year, when a free MP3 music player software program was offered on the Internet, high-quality digital audio files became easily downloadable, and MP3 became the most popular trend in consumer audio. Soon the Internet was full of Web sites offering players and files. Chief among these was Napster, which offered Internet users just about any type of music they wanted for free.
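The compression arithmetic is easy to check. The short calculation below assumes CD-quality audio (44,100 samples per second, 16 bits per sample, two channels), which is the usual baseline for the 10:1 figure; it works out the uncompressed size of a three-minute track and the size left after a 10:1 MP3 compression, landing in the same few-megabyte range described above.

# Rough size arithmetic for a three-minute CD-quality track vs. a 10:1 MP3.
# Assumes standard CD audio parameters; actual MP3 sizes vary with bit rate.

SAMPLE_RATE = 44_100      # samples per second (CD standard)
BITS_PER_SAMPLE = 16
CHANNELS = 2              # stereo
SECONDS = 3 * 60          # a three-minute track
COMPRESSION_RATIO = 10    # typical MP3 figure quoted in the 1990s

uncompressed_bytes = SAMPLE_RATE * (BITS_PER_SAMPLE // 8) * CHANNELS * SECONDS
mp3_bytes = uncompressed_bytes / COMPRESSION_RATIO

print(f"uncompressed: {uncompressed_bytes / 1_000_000:.1f} MB")  # ~31.8 MB
print(f"10:1 MP3:     {mp3_bytes / 1_000_000:.1f} MB")           # ~3.2 MB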

Music industry executives quickly became concerned about MP3, arguing that it was being used to steal intellectual property. Many Web sites offered songs without first obtaining copyright permission, posing a major threat to record labels and performers. Record companies launched efforts in 1998 and 1999 to bring under control what they saw as bootlegging. Some recording artists, however, felt that MP3 might be a good thing, introducing a new way to bring their music to the public. Many musicians saw the MP3 technology as a way to sidestep the powerful music publishing business by using the Internet to distribute their songs. In 1998, record companies sued the makers of a portable MP3 player in an effort to keep the player off the market, but they lost the case. Undaunted, the music industry continued its fight against MP3-related businesses into the twenty-first century.

In 1993, the addition of the program called Mosaic proved to be the final breakthrough in the Internet's ease of use. Before Mosaic, the Web was limited to only text or words. However, as a "graphical browser," the Mosaic program included multimedia links, meaning that a user could now view pictures, hear audio transmissions, and even see video transmissions. Before Mosaic, there were only about fifty Web sites on the Internet. After its release, the World Wide Web grew by an astronomical 341,000 percent. By 1999, there were more than eleven million domain names registered on the Web and more than seventy million Web sites. In thirty years, the Internet grew from a military concept for communicating after a nuclear war to an Information Superhighway that ushered in a social and economic revolution.

THE ENVIRONMENT: FROM GLOBAL WARMING TO THE EARTH SUMMIT

During the 1990s, global warming became a major concern for scientists and the public. Many scientists warned that carbon dioxide and other gases released from the burning of fossil fuels (coal, oil, and natural gas) were collecting in the atmosphere and acting like the glass walls of a greenhouse, trapping heat on the surface of Earth (a phenomenon known as the greenhouse effect). They predicted that average atmospheric temperatures could rise as much as 6.3 degrees Fahrenheit over the next century. If this occurred, the polar ice caps would melt, threatening coastal areas with flooding and causing massive climate changes throughout the world.

They cited evidence during the decade of increasing heat waves, melting polar ice, and rising sea levels—all thought to be caused by global warming. In Antarctica, Adélie penguin populations declined 33 percent over a twenty-five-year period because the sea ice where they lived was shrinking. In Bermuda and Hawaii, rising seas killed coastal mangrove forests and caused beach erosion. In the late 1990s, scientists studying Arctic ice discovered that the polar ice was less than half as thick as ice measured in the same area in earlier years, down from between six and nine feet to less than four feet thick. They also found that the water below the ice contained far less salt than normal, indicating that the ice was melting at an alarming rate, flooding the sea with fresh water. Scientists had known for some time that the Arctic ice cap was shrinking, especially since 1990, but did not expect the changes to be as great as they appeared.

Skeptics questioned the presumed connection between human activity and global warming, arguing that while the global temperature might be rising, it could be the result of normal changes in weather patterns. They pointed out that Earth had undergone several major climate shifts throughout its known history and suggested that these normal shifts, not the burning of fossil fuels, were responsible for the changes in global-weather patterns.

Scientific experiments and research, as well as international conferences throughout the decade, addressed growing concerns about global warming. Although most climatologists (scientists who study changes in Earth's atmosphere) accepted the theory that the burning of fossil fuels and the subsequent rise in carbon dioxide levels in the atmosphere was causing the planet to grow warmer, there was no agreement on how global warming might be reversed. America and the industrialized world had, since the nineteenth century, become too dependent on coal, oil, and natural gas to change their ways easily.

The Birth of the DVD

DVD-Video (digital versatile disc or digital video disc) is a high-capacity multimedia data-storage medium designed to accommodate a complete movie on a single disc, as well as rich multimedia and high-quality audio. DVD technology originated in 1994 as two competing formats, the Super Density (SD) disc and the MultiMedia Compact Disc (MMCD). In 1995, developers agreed on a single format called DVD, and in 1997 it became publicly available in the United States, quickly becoming the most popular consumer electronics item to date.

DVD-Video (often simply called DVD), the first widely used application in the country, was embraced by the movie industry, which wanted a disc, like a compact disc (CD), capable of holding a high-quality recording of a full-length feature with surround-sound audio. The disc was played on a DVD player hooked up to a standard television set, much like the older videocassette recorder (VCR). By the end of the 1990s, DVD-ROM (read-only memory), the format for delivering data and multimedia content that could be played by computers equipped with DVD-ROM drives, was forecast to grow even faster than DVD-Video. With its capacity to hold the increasingly complex multimedia applications being developed, DVD-ROM was used widely in the computer industry and for new video games with better and more realistic video content. The DVD-Audio format, designed to provide the highest possible audio fidelity, far exceeding the quality of conventional CDs, was introduced in 1999.
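Simple arithmetic shows why a full-length feature fits on a DVD but not on a CD. The sketch below uses the standard single-layer DVD capacity of 4.7 gigabytes and CD capacity of about 650 megabytes; the assumed average bit rate of 5 megabits per second for MPEG-2 video and audio is a representative figure chosen for the example, not one taken from the text.

# Back-of-the-envelope check: how much video fits on a CD vs. a single-layer DVD.
# 4.7 GB and 650 MB are standard capacities; the 5 Mbit/s bit rate is an assumption.

DVD_BYTES = 4.7e9          # single-layer DVD capacity
CD_BYTES = 650e6           # standard CD-ROM capacity
BIT_RATE = 5e6             # assumed average MPEG-2 video + audio, bits per second

def minutes_of_video(capacity_bytes, bit_rate_bps):
    """How many minutes of video a disc holds at the given bit rate."""
    return (capacity_bytes * 8) / bit_rate_bps / 60

print(f"CD:  {minutes_of_video(CD_BYTES, BIT_RATE):.0f} minutes")   # ~17 minutes
print(f"DVD: {minutes_of_video(DVD_BYTES, BIT_RATE):.0f} minutes")  # ~125 minutes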

A totally enclosed greenhouse, Biosphere 2, was designed to mimic conditions on Earth (Biosphere 1) in a sealed, controlled environment. One of the most spectacular structures ever built, it is located in the Sonoran Desert at the foot of the Santa Catalina Mountains not far from Tucson, Arizona. It is the world's largest greenhouse, made of tubular steel and glass, covering an area of three football fields and rising to a height of eighty-five feet above the desert floor. Within the structure, there is a human habitat with a farm for the biospherians, or inhabitants, to work to provide their own food. There are also five other wild habitats or biomes representing a savannah, a rain forest, a marsh, a desert, and an ocean. Biosphere 2 is completely sealed so no air or moisture can flow in or out. Nearby are two balloon-like structures that operate like a pair of lungs for Biosphere 2 by maintaining air pressure inside. Only sunlight and electricity are provided from outside.

On September 26, 1991, four women and four men from three different countries entered the Biosphere 2 and the doors were sealed for the two-year-long initial program of survival and experimentation. During this time, the biospherians attempted to run the farm and grow their own food in the company of some pigs and goats and many chickens. They shared the other biomes with over four thousand species of animals and plants that were native to those habitats. The resident scientists observed the interactions of plants and animals, their reactions to change, and their unique methods of living. The biospherians also had the assignment of experimenting with new methods of cleaning air and water.

On September 26, 1993, the biospherians emerged from Biosphere 2. It had been the longest period on record that humans had lived in an "isolated confined environment." Unfortunately, the experiment did not live up to expectations. Oxygen levels inside the complex dropped so low that supplemental oxygen had to be added to protect the lives of the eight biospherians—violating the idea of total isolation. An unusually cloudy year in the Arizona desert stunted food production in Biosphere 2, because the plants needed sunlight, and some reports suggested that scientists smuggled in extra food. Nearly all of the birds, animals, and insects that were brought into the environment and expected to thrive there died instead, though ants and cockroaches ran rampant. In 1996, Columbia University took over operation of the facility, using it as a teaching and research tool for environmental science education and research.

In 1992, representatives from 172 nations met in Rio de Janeiro, Brazil, for the first United Nations Conference on Environment and Development (UNCED), or International Earth Summit. This meeting was held to address problems of environmental protection and the question of how worldwide economic development could be achieved without sacrificing the environment. The assembled leaders discussed global issues ranging from an increase in population to global warming to protecting the world's plant and animal species. The Earth Summit met with mixed success. Critics charged that the most advanced countries, including the United States, were trying to regulate the development of poorer countries without improving their own environmental performance.

In 1997, the United Nations issued a five-year review of the progress of Earth Summit agreements. One of the findings of the review indicated that global water supplies could be in danger. The supply of fresh, clean water, already threatened by growing levels of pollution, was found to be growing so scarce in some areas that two-thirds of humanity could suffer moderate to severe water shortages by 2030.

A WIRELESS WORLD

Mobile, or wireless, communications became a significant part of American life during the 1990s. At the end of the decade, a poll showed that the mobile or cellular phone was the most important new personal communications device for many Americans. Most respondents cited cell phones as the technology that they used most in their daily lives, more than a computer, E-mail, or the Internet. Cellular telephones became ever smaller and more portable, and could be carried inconspicuously. Lower costs and greater convenience gained millions of new customers for mobile-phone technology every year. By 1995, there were approximately eighty-five million users of cellular telephony worldwide, and thirty-two million of them were Americans. As of June 1999, there were more than seventy-six million wireless communications subscribers in the United States, with 38 percent of these using digital wireless technology.

The term cellular comes from the design of the system, which divides geographical service areas into smaller pockets, called cells. Each cell contains a base station that accepts and transfers the calls from mobile phones based in that cell. The cells are interconnected by a central controller, called the mobile telecommunications switching office (MTSO). The MTSO connects the cellular system to the conventional telephone network, and it also records call information so that the system's users can be charged appropriate fees. In addition, the MTSO automatically examines each call's signal strength every few seconds and switches the call to a stronger cell if necessary. The user does not notice the "handoff" from one cell to another.
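The handoff decision the MTSO makes can be illustrated with a small simulation. In the Python sketch below, the cell names, the signal readings, and the one-second check interval are all invented for the example; the controller simply compares the signal strength reported for nearby cells at each check and silently switches the call to whichever cell is strongest.

# Toy simulation of an MTSO monitoring signal strength and handing off a call.
# Cell names, signal readings, and the check interval are invented for illustration.

def strongest_cell(readings):
    """Return the cell reporting the best signal for this phone."""
    return max(readings, key=readings.get)

def monitor_call(samples, starting_cell):
    """Re-check signal at each interval and hand off when a stronger cell appears."""
    current = starting_cell
    for t, readings in enumerate(samples):
        best = strongest_cell(readings)
        if best != current:
            print(f"t={t}s: handoff {current} -> {best} (user notices nothing)")
            current = best
        else:
            print(f"t={t}s: staying on {current}")
    return current

# Signal strength (arbitrary units) seen by three base stations as a car drives.
samples = [
    {"cell_A": 90, "cell_B": 40, "cell_C": 10},
    {"cell_A": 70, "cell_B": 65, "cell_C": 20},
    {"cell_A": 45, "cell_B": 85, "cell_C": 35},
    {"cell_A": 20, "cell_B": 60, "cell_C": 75},
]
monitor_call(samples, starting_cell="cell_A")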

Traditional cellular technology uses analog service. This type of service transmits calls in one continuous stream of information between the mobile phone and the base station on the same frequency. Analog technology modulates (varies) radio signals so that they can carry information such as the human voice. The major drawback to using analog service is the limitation on the number of channels that can be used.

Digital technology, on the other hand, uses a simple binary code to represent any signal as a sequence of ones and zeros. The smallest unit of information for the digital transmission system is called a bit, which is either one or zero. Digital technology encodes the user's voice into a bit stream. By breaking down the information into these small units, digital technology allows data to be transmitted faster and in a more secure form than analog.
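That encoding step can be sketched in a few lines of code. In the Python example below, the sample values and the choice of eight bits per sample are arbitrary illustration choices: each measurement of the voice signal is rounded to the nearest of 256 levels and written out as ones and zeros, producing the kind of bit stream a digital phone transmits, and the same bits can be turned back into a close approximation of the original samples.

# Sketch of digitizing a voice signal: quantize each sample and emit bits.
# The sample values and 8-bit resolution are arbitrary illustration choices.

BITS = 8                       # 8 bits per sample -> 256 possible levels
LEVELS = 2 ** BITS

def encode(samples):
    """Turn analog samples (floats in -1.0..1.0) into a stream of ones and zeros."""
    stream = ""
    for s in samples:
        level = round((s + 1.0) / 2.0 * (LEVELS - 1))   # map -1..1 onto 0..255
        stream += format(level, f"0{BITS}b")             # fixed-width binary
    return stream

def decode(stream):
    """Recover approximate sample values from the bit stream."""
    values = []
    for i in range(0, len(stream), BITS):
        level = int(stream[i:i + BITS], 2)
        values.append(level / (LEVELS - 1) * 2.0 - 1.0)
    return values

voice = [0.0, 0.42, 0.91, 0.33, -0.5]      # pretend microphone measurements
bits = encode(voice)
print(bits)                                # a long string of ones and zeros
print([round(v, 2) for v in decode(bits)]) # close to the original samples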

Scientists at Bell Laboratories had invented cellular technology in the late 1940s, but the growth of wireless communications did not begin until 1983, when the Federal Communications Commission (FCC, the government agency in charge of regulating anything that goes out over the airwaves, such as telephone, television, and radio) authorized commercial cellular service in the United States.

Y2K

One of the biggest concerns during the last year of the 1990s was the Y2K bug (Y2K was the nickname for the year 2000: Y for year and 2K for two thousand, K being the standard symbol for one thousand). The so-called Y2K bug was a fault built into computer software because early developers of computer programs were uncertain that computers would even have a future. To save on memory and storage, these developers abbreviated dates to two digits each for the month, day, and year. For instance, October 14, 1966, was stored as 101466. However, this short form could also mean October 14, 1066, or October 14, 2066.
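The ambiguity is easy to reproduce. The Python sketch below parses a six-digit date the way two-digit-year software did, shows that 101466 by itself cannot distinguish 1966 from 1066 or 2066, and applies the "windowing" style of repair many programs adopted, in which a cutoff year decides which century a two-digit year belongs to; the cutoff of 50 used here is an arbitrary example value.

# The Y2K problem in miniature: two-digit years are ambiguous.
# The windowing cutoff (50) below is an arbitrary example value.

def parse_two_digit_date(text):
    """Read a MMDDYY date the way many old programs stored it."""
    month, day, yy = int(text[0:2]), int(text[2:4]), int(text[4:6])
    return month, day, yy

def expand_year(yy, cutoff=50):
    """'Windowing' repair: years at or above the cutoff mean 19xx, below it 20xx."""
    return 1900 + yy if yy >= cutoff else 2000 + yy

month, day, yy = parse_two_digit_date("101466")
print(f"stored as: month={month}, day={day}, year={yy}")  # year 66: 1066? 1966? 2066?
print(f"windowed guess: {expand_year(yy)}")               # 1966
print(f"year '00' becomes: {expand_year(0)}")             # 2000, not 1900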

As early as 1997, it had been officially determined that many computers, from those operating in agencies of the federal government to those found in individual homes, might not recognize the year 2000. Experts feared widespread simultaneous crashing of systems everywhere from Automatic Teller Machines (ATMs) to electrical grids and hospital equipment, and even the possible accidental detonation of atomic weapons. To ease the public's concerns, businesses and government agencies issued "Y2K ready" statements, meaning their computers had been reprogrammed to handle the date change.

For some people, though, the Y2K problem signaled nothing less than the possible end of the world. All over the country, people stocked up on power generators, dried and canned food, bottled water, guns, and ammunition in preparation for a possible New Year's Day that would begin in chaos. Y2K fears spawned a host of associated products and businesses from survival videos to advisory Web sites to agencies that, for a fee, reserved places in Y2K-safe communities. Most Americans, however, took the whole thing in stride. As early as January 1999, almost 40 percent of people surveyed believed that the Y2K problem was not something about which to be terribly concerned. In the end, it wasn't.

That year, Bell Laboratories' Advanced Mobile Phone System (AMPS), the first commercial analog cellular service, became a reality, and mobile phones began to be used throughout North America, mainly by business executives in their automobiles. During the 1980s and early 1990s, regions of the world such as Western Europe and Latin America also began to develop wireless communications. By the end of the decade, mobile phone use was widespread among the general population.

With the rise of cellular use, however, American society was confronted with a host of new problems resulting from noisy phones and beepers going off everywhere from churches to classrooms to movie theaters to restaurants. Perhaps the biggest problem debated was the use of cellular phones while driving. According to a study published in the New England Journal of Medicine, people talking on a cellular phone while driving were four times more likely to be in an accident than the average driver. Those odds placed the cellular-phone driver on the same level as a drunk driver. Several state legislatures and city halls proposed bills that would ban the use of cellular phones while driving.

NASA: PROBE FAILURES, HUBBLE, AND THE ISS

Facing criticism early in the decade that it was wasting money, the National Aeronautics and Space Administration (NASA) announced that it would find ways to do more with less. The agency's mantra became "faster, better, cheaper." Indeed, NASA had great successes during the decade, including the space probe Mars Pathfinder, built for a tenth of the cost of its predecessors and hailed as a huge success. The probe landed on the surface of Mars on July 4, 1997, and released the Sojourner rover, the first independent vehicle to travel on another planet. The probe and rover sent back to Earth sixteen thousand images and 2.6 billion bits of information about the Martian terrain, including chemical analyses of rocks and the soil. In addition, John Glenn, U.S. senator and the first American to orbit Earth, returned to space in 1998 aboard the NASA space shuttle Discovery to a surge of public approval.

The space agency, however, did suffer some humiliating losses during the decade. The $194 million Mars Polar Lander, carrying the Deep Space 2 probes, was launched on January 3, 1999, and lost on December 3 because of a software glitch in the lander's computer. Worse, perhaps, was the catastrophic failure of the Mars Climate Orbiter, launched December 11, 1998, and lost in September 1999. The primary cause of the failure was that the builder of the spacecraft, Lockheed Martin, provided one set of specifications in old-fashioned English units, while its operators at the NASA Jet Propulsion Laboratory were using the metric system. The report on the failure also uncovered management problems that let the mistake go undiscovered, including poor communication between mission teams, insufficient training, and inadequate staffing. With the "faster, better, cheaper" approach, the navigation team was seriously overworked trying to run three missions at once.
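The unit mismatch behind the Mars Climate Orbiter loss is small enough to check by hand. The sketch below applies the standard conversion of one pound-force second to about 4.448 newton-seconds to an invented thruster-impulse value, showing how a figure delivered in English units but read as metric understates the effect of every engine firing by a factor of roughly 4.45.

# The Mars Climate Orbiter mix-up in miniature: English vs. metric impulse units.
# The impulse value is invented for illustration; the conversion factor is standard.

LBF_S_TO_N_S = 4.448222  # 1 pound-force-second expressed in newton-seconds

reported_impulse = 120.0                             # delivered in pound-force-seconds
correct_impulse = reported_impulse * LBF_S_TO_N_S    # what it really was, in N*s
misread_impulse = reported_impulse                   # what the software assumed: N*s

print(f"actual impulse:  {correct_impulse:.1f} N*s")
print(f"assumed impulse: {misread_impulse:.1f} N*s")
print(f"understated by a factor of {correct_impulse / misread_impulse:.2f}")  # ~4.45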

One significant NASA project in the 1990s was the Hubble Space Telescope (HST), which began as a failure but ended as a success. Astronauts aboard the space shuttle Discovery in 1990 released the Hubble Space Telescope into orbit 370 miles above Earth. Because it did not have to look through the distorting prism of the atmosphere, the HST could view objects with greater brightness, clarity, and detail than any telescope on Earth. The mission initially appeared to be a failure, as scientists learned shortly after the HST began orbiting Earth that the curve in its primary mirror was off by just a fraction of a hair's width. This flaw caused light reflected from the edge of the mirror to focus at a different point than light reflected from its center. As a result, the HST produced blurry pictures.

In 1993, astronauts aboard the space shuttle Endeavour caught up with the HST and installed corrective optics, a set of coin-sized mirrors that brought the light into proper focus. In 1997, another space shuttle crew conducted general repairs to the HST. Then in November 1999, the HST stopped working after its gyroscopes broke down. Without the gyroscopes, the telescope could not hold steady while focusing on stars, galaxies, and other cosmic targets. A month later, during an eight-day mission, astronauts aboard the space shuttle Discovery installed almost $70 million worth of new equipment on the HST.

Despite the need for repairs, the HST has proven to be the finest of all telescopes ever produced. The thousands of images it has captured—a comet hitting Jupiter, a nursery where stars are born, stars that belong to no galaxy, galaxies that house quasars, galaxies violently colliding—have amazed astronomers. It allowed scientists to look deeper into space than ever before, providing evidence that supported the big bang theory (the idea that the universe was created in a violent event approximately twelve to fifteen billion years ago), and indicating that the universe could be younger than previously thought. It also identified disks of dust around young stars that suggested an abundance of planets in the universe, which might mean a greater chance of life in outer space.

The beginning of the International Space Station (ISS) project was another great achievement for NASA during the 1990s. A cooperative project involving the United States, Russia, Canada, Japan, and twelve other countries, the ISS was billed as a "city in space." First conceived by NASA in 1983, the project went through many design changes and consumed large sums of money before the first piece was ever built. As envisioned in the 1990s, the station would eventually extend more than three times the length of a football field and weigh more than one million pounds when completed. It would serve as a permanent Earth-orbiting laboratory that would allow humans to perform long-term scientific research in outer space.

American Nobel Prize Winners in Chemistry or Physics

1990: Elias James Corey (Chemistry); Jerome I. Friedman and Henry W. Kendall (Physics)
1991: No awards given to an American
1992: Rudolph A. Marcus (Chemistry)
1993: Kary B. Mullis (Chemistry); Russell A. Hulse and Joseph H. Taylor Jr. (Physics)
1994: George A. Olah (Chemistry); Clifford G. Shull (Physics)
1995: Mario J. Molina and F. Sherwood Rowland (Chemistry); Martin L. Perl and Frederick Reines (Physics)
1996: Robert F. Curl and Richard E. Smalley (Chemistry); David M. Lee, Douglas D. Osheroff, and Robert C. Richardson (Physics)
1997: Paul D. Boyer (Chemistry); Steven Chu and William D. Phillips (Physics)
1998: Walter Kohn (Chemistry); Robert B. Laughlin and Daniel C. Tsui (Physics)
1999: Ahmed H. Zewail (Chemistry)

On November 20, 1998, a Russian Proton rocket blasted off from the Baikonur Cosmodrome in Kazakhstan, carrying the first piece of the station: the Zarya (Sunrise) module, designed to provide the initial power, communications, and propulsion for the station. The second piece, the Unity connecting module, was brought into orbit a month later by a U.S. shuttle. By the end of the decade, the joint program was awaiting the launch of the Russian service module that would house hundreds of astronauts and cosmonauts over the life of the station. The key source of energy for the station was to be solar panels.

Critics charged that the station was too expensive, with an overall initial cost of $40 to $60 billion, and an estimated $98-billion cost for the fifteen-year life of the project (scheduled to be completed in 2006). Other detractors said the station had little real utility. It would be too expensive to manufacture anything aboard, and scientific experiments could be done more cheaply if the station were automated. Supporters of the project argued that it would allow for unprecedented scientific experiments in the near-zero gravity of space and serve as a platform for space-based innovations in the twenty-first century.
