Futures Imperfect: Technology and Invention in the Twenty-First Century

Overview

The most dramatic predictions for the twenty-first century can be found in the fields of biology, computers, nanotechnology, and space exploration. Biology stands ready to create the biggest changes: engineering new species, cloning, growing replacement organs, extending life, and modifying the brain could radically alter our relationship with nature and our view of what it means to be human. Computers are set to shrink until they disappear into our homes, toys, vehicles, and even our bodies. They will be combined into a worldwide network that will be always on, always sensing, and always available. Nanotechnology—building molecular devices—offers possibilities ranging from dust-sized robots to strong, light cables that can give us an elevator into space. The entire solar system will be within reach for exploration, colonization, and even tourism.

The art of predicting is highly flawed, with a history of laughable errors. There is a tendency to be too gloomy or too optimistic. Revolutionary changes cannot be accounted for. However, predictions do help shape the future, inspiring us to create new inventions, warning us about dangers, and stretching our imaginations.

Background

In 1863 Jules Verne wrote about the Paris of 1960. He imagined motor cars, computers, fax machines, and elevators, but he also had his protagonist writing up the daily accounts in a large book with a quill pen. Prediction, even "scientific" prediction, isn't easy.

Astrologers have cast horoscopes and magicians have told fortunes for millennia, but until the nineteenth century prediction was largely concerned with reconfiguring the familiar. The revolutions of the 1700s produced a sense of social change, but material change remained beyond the imaginations of most. It was only in the 1800s—the age of invention that brought us steamboats, railroads, telegraphs, telephones, cars, and even motion pictures—that the concept of changes in day-to-day life became part of the popular culture.

Verne was at the vanguard of those predicting technological change. He wrote about submarines, flights around the world, and a trip to the moon. (His moon craft was launched from Florida, but it was shot from a massive cannon rather than propelled by rockets.) H.G. Wells wrote about the atomic bomb in 1914. Rudyard Kipling wrote about airmail in 1904.

Writers didn't have all the fun. In 1900 the magazine Ladies' Home Journal predicted a startling set of new technologies, including refrigeration, air conditioning, tanks, and television ("Persons and things of all kinds will be brought within focus of cameras connected electrically with screens at the opposite ends of circuits, thousands of miles at a span.") But the magazine also foresaw express mail via pneumatic tubes and quiet cities where all traffic ran far above or below the surface.

The errors of prediction, born of pessimism, optimism, and the inability to anticipate breakthroughs, can be amusing. Historian Henry Adams foresaw the end of the world by 1950. Household robots and a cure for cancer still have not arrived. With one exception, the science fiction writers who virtually owned the future until the 1940s missed the advent of the birth control pill and the changes it brought to society.

In 1949 science fiction's best forecaster, Robert Heinlein, predicted not only the birth control pill but also air traffic control, cell phones, answering machines, and a dozen other advances that came true by 2000. Heinlein offered his forecasts at about the same time that the discipline of futurology—first suggested by Wells in 1901—was being established.

At first, the futurologist's approach was generally to find the right theory of social change and project from accumulated data. This method was not notably successful. In the 1950s Herman Kahn pioneered the scenario technique, examining possible, probable, and preferable futures with the idea of shaping rather than merely predicting what was to come. Computer modeling followed, and one of the most famous works of futurology, The Limits to Growth (1972), made extensive use of it to suggest future scenarios. Many of the study's models have since proven wrong, but the book influenced the adoption of recycling and legal controls on pollution.
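
To make "computer modeling" concrete, the toy Python sketch below runs a logistic growth simulation, the simplest kind of limits-to-growth calculation. It is an illustration only, not the World3 model the study actually used, and its parameters (4 billion people, a 12-billion carrying capacity, 2% annual growth) are assumptions chosen for readability.

    def simulate_logistic(pop, capacity, rate, years):
        """Step a logistic growth model forward one year at a time."""
        trajectory = [pop]
        for _ in range(years):
            pop += rate * pop * (1 - pop / capacity)  # growth slows near the limit
            trajectory.append(pop)
        return trajectory

    # Illustrative run: population in billions, sampled once per decade.
    for decade, p in enumerate(simulate_logistic(4.0, 12.0, 0.02, 100)[::10]):
        print(f"year {decade * 10:3d}: {p:5.2f} billion")

Even this toy version shows the behavior such studies turned on: growth that looks exponential early on flattens as it approaches a resource limit.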

In addition to developing scenarios and using computer models, futurologists ground their work in trend analysis and extrapolation. (For instance, Moore's law, which states that computing power doubles every 18 months, is the bedrock of many technical forecasts.) They discover emerging ideas by scanning publications (both technical and social) and polling experts. They combine elements of change to see if they create unexpected or larger effects.
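
As a worked example of trend extrapolation, the short Python sketch below computes the growth factor implied by Moore's law's 18-month doubling period; the helper function is a hypothetical illustration, not a standard forecasting tool.

    def growth_factor(years, doubling_months=18):
        """Growth multiple after a given number of years at a fixed doubling period."""
        return 2 ** (years * 12 / doubling_months)

    # Ten years at an 18-month doubling is 120/18, about 6.7 doublings: roughly 100x.
    print(f"10-year growth factor: {growth_factor(10):.0f}x")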

Predicting the Future

Prediction has become a mainstay of modern life. We expect change, and we expect our experts to tell us about it. We get forecasts not just for the next five days of weather but for inflation in the next quarter and for demographic shifts decades ahead. Companies depend on visions, marketing plans, and sales projections.

Still, prediction remains more of an art than a science. There is no suite of rigorous tools for prediction. Informed intuition coupled with imagination may provide the highest success rate. With the end of the twentieth century, a flurry of predictions was made for the twenty-first. There was no shortage of robots, flying cars, and missions to Mars, but most predictions dealt with the networking of the planet and changes to the human species.

Networking the Planet

As the impact of the Web continues to grow, many predictions for 2000-2100 deal with the reinvention and reshaping of our communities, much of it expected in the first half of the century. Central to these developments are advances in computers. Current technology is expected to follow Moore's law until at least 2017, continuing to make devices smaller, faster, and cheaper. At the same time, communications capacity is doubling every nine months.
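
Taken together, those two doubling periods imply that communications capacity will outgrow computing power roughly a hundredfold per decade. The back-of-the-envelope Python check below assumes both trends hold cleanly for ten years, which is itself an optimistic extrapolation.

    # Computing doubles every 18 months; communications capacity every 9 months.
    compute_gain = 2 ** (120 / 18)    # months in a decade / doubling period
    bandwidth_gain = 2 ** (120 / 9)
    print(f"compute ~{compute_gain:,.0f}x, bandwidth ~{bandwidth_gain:,.0f}x, "
          f"ratio ~{bandwidth_gain / compute_gain:,.0f}x")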

These trends mean that early in the 2000s everything—televisions, shoes, vegetables—will get smart and connected. Nanotechnology—building devices on a molecular level—will enable much of this development in our networked world. Prototypes have already been made of tiny gears, motors, sensors, diodes, and transistors. Smart dust and micromachines will keep a continuing tally of where everything and everyone is and what they're up to. The huge capacity and power of a worldwide computer system will be both invisible (embedded within common devices) and omnipresent (gathering and providing information everywhere).

A big piece of the computing capacity will be given over to making information more accessible and easier to turn into value. Information systems will respond to speech, expression, and gestures. They will adapt to your style and your mode of learning and communicating. They will also adapt your environment to you, adjusting lighting, heat, music, odors, and the arrangement of devices. Information you care about will be discovered, collected, and delivered based on your electronic identity and the collective judgments of your peers. It may even be possible to change hardware to suit your needs, rewiring your cell phone into a memo pad on the fly.

Virtual reality will become more compelling and invade the real world as the years go by. First, the experience will become more realistic and the gloves and goggles less cumbersome. Later, direct neural stimulation by nanomolecular machines (devices the size of viruses) may make the equipment disappear into the users. By the end of the century, and perhaps much sooner, data and interpretation will overlay our sense experience, and it will be taken for granted that every real-world experience has a virtual aspect.

Changes to the Species

While networking relies on physics to transform our experience of community, advances in biology, which may dominate the second half of the century, are likely to touch us more personally. Some developments may even challenge the concept of what it means to be human.

Cloning is on the immediate horizon. So is tissue regeneration, with simple lab-grown organs, such as bladders, undergoing testing. And 2001 is the year of the gene, thanks to the complete sequencing of the human genome. Together, these advances promise not just better health but a degree of control over the redesign of human biology. Rejuvenation, the addition of talent genes, and even new organs to see infrared light or breathe underwater will all be within reach.

Scientists are taking on the challenges of Alzheimer's disease, substance abuse, and mental illness, and their findings are also leading to a better understanding of the brain and how it works. Drugs to prevent senility are likely to be used more broadly to improve memory. Insights into dependency and behavioral disorders could lead to medicines that enhance creativity, math skills, or verbal expression. In the 2000s it will probably be possible, within a range, to choose your personality, mood, and intelligence.

Nanotechnology may play a role here. Besides enhancing our health—by sensing and reporting on changes, hunting down pathogens and cancer cells, or repairing arteries—synthetic devices could also augment our bodies and our brains. Nanodevices could provide electronic memory or make our brains into nodes on the information network. Human brain processes are already being mapped using external tools such as magnetic resonance imaging (MRI) and positron emission tomography (PET). Nanodevices could make this mapping more detailed, monitoring—and possibly even altering—our thoughts.

Whether or not artificial brain components become more common than electronic eyes, mechanical arms, and artificial hearts, we will probably have plenty of cyborgs among us. An aging population may prefer durable, versatile manufactured parts to grown organic replacements. At some point during the 2000s, the distinction between cyborg humans and increasingly human robots might blur. As early as 2020 a PC could have the processing power of a human brain. That will be followed by 80 more years of development, with some of those artificial brains working 24 hours a day on their own improvement. Will relationships with our cyber children be less valid than those with our carbon-based kin? Already, millions of people have electronic identities, ranging from credit and purchasing profiles to avatars that may bear little resemblance to their flesh-and-blood antecedents but are dealt with seriously and socially by their peers. It might not be too much of a stretch to accept artificial intelligences as our friends and colleagues.

Immortality may be the biggest step humans take in the 2000s. Some scientists predict that children born in the century's second decade will live two or more centuries. Most of these children will be born into a world that will hold ten billion inhabitants by the century's close. Population numbers are among the most predictable figures futurologists work with; disruptive changes, like the discovery of intelligent aliens, are never predictable. In fact, a radical group of futurologists, science fiction writers, and other forecasters sees the future as totally unpredictable from about 2035 on. They envision a positive feedback loop of technical development: around that time, the combination of powerful computers, the explosion of knowledge, and new technologies such as nanotechnology will create a world beyond our imagination, and beyond the imaginations of even the best futurologists.

PETER J. ANDREWS

Further Reading

Books

Clute, John, and Peter Nicholls. The Encyclopedia of Science Fiction. New York: St. Martin's Press, 1995.

Kahn, Herman, William Brown, and Leon Martel. The Next 200 Years: A Scenario for America and the World. New York: William Morrow, 1976.

Verne, Jules. Paris in the Twentieth Century. New York: Del Rey, 1997.

Other

"The Future Gets Fun Again." Periodical article. Wired Magazine (February, 2000).

Sandberg, Anders. Transhumanist Resources. http://www.aleph.se/Trans/


SCIENCE AS POPULAR ENTERTAINMENT

In the last half of the twentieth century, science-related entertainment moved out of the mythical mad scientist's laboratory to become a mainstay backdrop for popular culture. Instead of literary works such as Mary Shelley's Frankenstein, which explored man's deepest fears about himself, science-related entertainment in the post-nuclear age offered a bewildering array of radioactive monsters and aliens that often portrayed our fears of each other. The development of television in the nuclear age, particularly the extensive live coverage of the space race to the Moon in the 1960s, put science on everyone's mind and science terminology on every tongue. Science, whether in the guise of science fiction or fantasy, became a big box-office draw around the world. Films such as E.T., Close Encounters of the Third Kind, Star Wars, and Jurassic Park touched on subjects ranging from extraterrestrial visitations to dramas set in galaxies far, far away to genetics and the possible ramifications of DNA-based technologies. One crossover hit, the popular Star Trek series, became an enduring, multigenerational, space-based adventure phenomenon.

On a more scholarly note, growing audiences debated the meanings and insights found in classic science fiction movies and books such as those written by Arthur C. Clarke. In 1980 Carl Sagan's television series Cosmos became the most popular public television series of its time. Millions of viewers thrilled to Sagan's explanations of profound cosmic complexities and connections. A hauntingly beautiful musical score accompanied stunning visual effects in what was then a state-of-the-art presentation of science and the history of science. Cosmos also appeared in book form and quickly became the best-selling science book ever published in the English language. Spurred by Sagan's success, other mainstream scientists, used to laboring only in academia, began to produce works for lay audiences. Stephen Hawking's A Brief History of Time, a huge critical and commercial success, provided profound insights for scientists and nonscientists alike into the nature and origins of the universe.