Nanotechnology


THE HISTORY OF NANOTECHNOLOGY
TYPES OF NANOMATERIALS
BUSINESS APPLICATIONS
NANOTECHNOLOGY ETHICS AND RESPONSIBILITY
BIBLIOGRAPHY

Applied to various materials across many industries, nanotechnology is the science of the very small. Originally, it dealt with the engineering of nanoparticles to build mechanisms on an atomic level, but this has become only one definition, now referred to as MNT, or molecular nanotechnology. The meaning of nanotechnology has evolved to now include all scientific endeavors below micro technology, thereby encompassing any products and materials dealing with nanoscale operations. Due to the possibilities of nanotechnology in so many fields, the science has received increased attention from both businesses and media in recent years.

Nano refers to the infinitesimal nanometer, one billionth of a meter; at this scale, molecules, atoms, and their components can be physically manipulated, arranged, and built into layers. At the technical level, nanotechnology is interested in using these molecular construction abilities to create machines and computers at the nanoscale. In theory, these tiny systems will be capable of incredible speed and atomic precision. On a more universal and practical level, nanotechnology can arrange molecules to help create everyday, life-size products with new qualities such as weather resistance, conductivity, and enhanced efficiency.
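To put the nanometer scale in context, here is a minimal back-of-the-envelope sketch; the comparison objects and their approximate sizes are illustrative assumptions, not figures from this article.

```python
# Illustrative size comparisons (approximate, assumed values) showing
# where the nanometer sits relative to familiar objects.
NM_PER_METER = 1e9

examples_nm = {
    "hydrogen atom": 0.1,            # ~0.1 nm across
    "DNA double helix (width)": 2.0,
    "typical virus": 100.0,
    "human hair (width)": 80_000.0,
}

for name, size_nm in examples_nm.items():
    print(f"{name}: {size_nm:,.1f} nm = {size_nm / NM_PER_METER:.1e} m")
```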

The predicted uses for nanotechnology are myriad, covering nearly all industries from fashion to medical procedures, but in reality only a handful of marketable products are currently being created with nanotechnology methods. Certain cosmetics and sunscreens are enhanced with nano-engineered substances derived from metals. Healthcare devices, such as testing tools and treatments, are being produced that carry entire laboratories on computer chip-sized areas. Coatings and fabrics are being created with nanotechnology protection. However, large-scale investment in nanotechnology still outpaces its current profits, if not its expectations.

THE HISTORY OF NANOTECHNOLOGY

Nanotechnology has had a long and colorful history, due to the large number of predictions made concerning its powers and world-changing abilities, abilities which have been neither disproved nor fully confirmed in the continuing study of nano mechanisms. In 1959, Richard Feynman gave a famed speech to the American Physical Society in which he explored the possibility of engineering molecules to contain information (specifically writing); from there, Feynman extrapolated to mention possible biological and computational applications. This speech was the first notable exploration into nanotech science.

The term nanotechnology was not coined until 1974, when Japanese professor Norio Taniguchi used it to describe production technology capable of working in the nanometer range, such as the thin-film deposition and ion-beam milling used in semiconductor fabrication. In 1977, engineer K. Eric Drexler (b. 1955) again used the word nanotechnology in his studies on molecular engineering and later in his influential books Engines of Creation: The Coming Era of Nanotechnology and Nanosystems: Molecular Machinery, Manufacturing, and Computation. Drexler was one of the first scientists to work with practical nanotechnology theory, creating such key concepts as the nano assembler, an atomic-scale machine that would be capable of replicating itself and creating other simple constructions. He also explored the biological and ethical implications of depending on nanotechnology.

In the 1980s and 1990s many processes vital to nano theory were developed. STM, or scanning tunneling microscopy, was invented in 1981 and allowed researchers to analyze and manipulate individual atoms. In 1985, spherical fullerenes were discovered and nicknamed buckyballs; these complex carbon molecules became the basis for the first practical nanoengineering. The science blossomed in the late 1980s as the first books, journals, classes, and organizations devoted to nanotechnology were created. Goals and research grants became much more common, and in 1993 the first Feynman Prize in Nanotechnology was awarded to Charles Musgrave. As the 1990s progressed, government interest and involvement in nanotechnology increased. The first nano company, Zyvex, was founded in 1997, with the first designs for nanorobotics and nanobio ideas being developed around the same time.

Into the Twenty-First Century. The 2000s brought several close inspections of the societal implications of rising nanotechnology. Congressional hearings were conducted to investigate the dangers and necessary safety protocols, and the Nanotechnology R&D Act of 2003 required that all federally supported nanotechnology research consider the societal and ethical implications of nanotechnology (SEIN). The year 2003 also saw the publication of the debate between Eric Drexler and Rice University professor Richard Smalley (1943–2005) concerning the future of nanotechnology. Smalley, winner of a Nobel Prize in chemistry, argued against many of the entrenched ideas developed by Drexler, including the concept of molecular assemblers, which he thought badly conceptualized. To explain his argument, Smalley posed the "sticky fingers" and "fat fingers" problems, which proposed that technicians could not be precise enough, given current nano theory, to create and maintain the sort of assemblers Drexler visualized. His counter-theories involved a much heavier emphasis on catalysts and chemical progression in nanotechnology. Smalley also disagreed with the rising speculation on the dangers of nanotechnology, especially the doomsday scenarios, which he found to be impossible.

Mihail Roco of the U.S. National Nanotechnology Initiative has described four stages in the evolutionary history of nanotechnology. The first stage, he suggested, is the passive stage, in which simple materials are made with passive nanotech abilities, such as coatings, fabrics, aerosols, and many construction elements. This is the stage nanotechnology has currently reached. The second stage, active nanostructures, includes biological and computer-related nanosystems that can carry out specific operations. Biological nanostructures can consist of disease-targeting drugs, and computer structures can be mini-transistors and amplification systems. In the late 2000s, this is the stage nanotechnology is beginning to enter. The third stage, which the science is still experimenting with, sees nanotechnology expanding to include nanosystems that can assemble and direct various three-dimensional operating structures (simple robotics). The fourth stage occurs when nanotechnology reaches the creation of molecular nanosystems that can produce complex computer devices at the nano level.

TYPES OF NANOMATERIALS

There are many different types of nano-produced substances, each with its own abilities and uses in multiple industries. Most involve simple structures, all with functional properties:

  • Nanocomposites. Nanocomposites are materials that are infused into other objects so that products gain new properties or abilities. A silica-based nanocomposite is currently being used in experimental removal of toxic waste, creating a semiporous bundle of coated silica that can attach to some heavy metals and cleanse them from liquids such as water. Other, simpler nanocomposites are being used in vehicle construction, especially plastic add-ons like bumpers, protectors, and steps. The composites are more resistant, stronger, less prone to damage, and more lightweight than the normal plastics they replace. Given these qualities, it is no surprise that many companies are beginning to use nanocomposites across different industries, including sports equipment such as golf clubs and tennis rackets.
  • Nanocrystals. The nanoparticles infused into other materials are often nanocrystals, or molecules created in multiple-layered crystals with special characteristics. For instance, metal aligned in these crystalline forms has proven to be much stronger and longer lasting than normal metal composition. Due to the arrangement of nanocrystals, a metal nanocrystal structure is able to support itself so thoroughly that it becomes 100 to 200 percent stronger than more commonly used metal compositions. Other crystals are able to absorb light and refract it in a different color, a property useful in microscopic inspections. Because of these light-changing qualities, nanocrystals play an important role in solar-energy panels.
  • Nanoparticles. Nanoparticles are single nanomaterials that are not joined together in specific structures the way composites and crystals are, but instead are strewn along and amidst the affected item to give it certain abilities. Many of the new stain-resistant fabrics used in khakis, dress shirts, and ties are made with nanoparticles attached to the fibers; this allows spills or stains to slide off the material. Some paints also make use of polymer nanoparticles to repel water. Certain medicines can also be administered using nanoparticulate technology, which transforms normal pills into highly soluble, bioabsorbable materials made completely of nanoparticles.
  • Nanopowders. Nanopowder is part of a manufacturing process to create superior products. Often, nanopowders can be a beginning stage of materials that will be refined to become nanocomposites. An excellent example is the carbide nanopowder, the grains of which are 15 nanometers or less. This powder is used to make an alloy nearly as hard as diamond; this has multiple applications, from drill bits to bulletproof armor.
  • Nanoclays. Nanoclay is made of volcanic minerals, or smectite, structured into plates so thin that they are only about one nanometer thick but several hundred nanometers in width and length. These nanoclays are often put through a process (known as intercalation) to make them easier to integrate into organic polymers. Once intercalated, nanoclays and polymers are mixed to create many kinds of composite products. Such products have the inherent strength and protective abilities common in nano-created materials. These clay composites account for about one quarter of the nano market today.
  • Nanocomposite coatings. These coatings are nanocomposite in form, made from synthetic nanocrystals or the carefully mixed nanoclays and polymers. However, instead of being fully integrated into the material, these coatings are preserved in liquid form and painted on the surface of items to instill nano qualities in them. Uses include the area of electronics, where a sprayed nanocomposite coating can protect the screens of phones and other handheld devices. The attractiveness of nano coatings in manufacturing is increased by their non-toxic nature and environmentally friendly composition, different from many of the chemicals used today.
  • Nanotubes. Most often made of carbon, nanotubes are structured in a honeycomb pattern that wraps around on itself to form a tube-like shape. These tubes can be combined in many layers to form a fiber that is intensely strong and also has unique electrical properties. The way a nanotube is formed, molecule by molecule, can make it into either a type of semiconductor (such as silicon) or a metal. The semiconductive properties of nanotubes mean that they could be used to make more efficient electronic viewing screens than those currently in use today.
  • Nanocatalysts. Nanocatalysts are used to begin chemical reactions that change the properties of materials. Many nanoparticles have exceptionally large surface areas relative to their total amount of material, and so they serve as better catalysts, since a chemical catalyst depends on how much of one substance comes in contact with another (see the surface-area sketch after this list). Because they are so small, nanoparticles can catalyze reactions very efficiently. Current experiments are using gel-based nanocatalysts to transform coal into oil products. The use of nanocatalysts as enzyme-like actors in chemical processes was an approach championed by Richard Smalley, and such reactions could be used in both biological manipulation and purification processes.
  • Nanofilters. Nanofilters operate by forming nanofibers of incredible complexity that can allow only certain particles through, based on what the nanofiber is made of and how tightly it is structured. The better types of nanofilters can attract and trap amazingly small particles, such as those micron-sized and below. This allows nano-designed filters to trap nearly all possible contaminants, letting only water or other liquids through. They also act much more efficiently than other kinds of filters, which allows water to pass through them at much higher speeds. Nanofilters can be used to help sterilize serums and protect people against biological warfare.
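The catalytic advantage of nanoparticles described above is largely geometric: dividing the same amount of material into ever-smaller particles multiplies the exposed surface. Below is a minimal sketch of that relationship, assuming idealized spherical particles; the radii are arbitrary example values, not figures from this article.

```python
import math

def surface_to_volume_ratio(radius_nm: float) -> float:
    """Surface-area-to-volume ratio of a sphere of the given radius, in 1/nm."""
    area = 4 * math.pi * radius_nm ** 2
    volume = (4 / 3) * math.pi * radius_nm ** 3
    return area / volume  # algebraically, this is just 3 / radius_nm

# From a millimeter-scale grain down to a nanometer-scale particle (example sizes).
for radius_nm in (1_000_000, 1_000, 10, 1):
    ratio = surface_to_volume_ratio(radius_nm)
    print(f"radius {radius_nm:>9,} nm -> surface/volume = {ratio:.3g} per nm")
```

Shrinking the particle radius by a factor of a million raises the surface-to-volume ratio by the same factor, which is why nanoscale catalysts can bring far more of a substance into contact with the reactants.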

BUSINESS APPLICATIONS

Because of the precautionary principle, legislation has encouraged nanotechnology development to progress very slowly, making sure that each experimental step is free of harmful possibilities. For this reason, many forecasted nanotech applications are stalled while scientists work on removing flaws and perfecting processes. Many industries, such as pharmaceutical companies, are currently hesitant to embrace the promising technology while results, though likely, are still slow to arrive. Also, due to the complexity of nanotechnology, the number of scientists working with it in companies is not increasing nearly as fast as the number of nano technicians, the people who actually put the technology into practice and develop, coat, and fuse nanoparticles to products. Technician positions, then, are the fastest-growing positions in the nanotechnology field, creating an entirely new workforce class with new training requirements. Despite these setbacks, there are several key areas in which nanotechnology is directly affecting the business world.

There is a small number of fields in which business investment in nanotechnology has reaped profitable rewards, and these areas can be expected to continue embracing nanotechnology and growing in size and influence. One of these promising fields is the development of silver nanoparticles, which are currently being used in a wide range of products, from clothing and toys to food and food utensils. Since many substances gain new properties when reduced to nano-sized pieces, common elements such as silver can be used in innovative processes, for example as an antimicrobial treatment. Nanoparticle silver, it turns out, kills germs and is easily spread, so it has found a strong market in products meant for young children, many types of food production, and simple utensils such as forks and chopsticks.

The coatings, fibers, and compositions developed through nanotechnology have become successful fields in which better materials have been and are being developed. Nanomedicine is also a very important field, although its current success is limited to drug delivery. Thanks to nanoparticle creation, drugs are being marketed that have greater longevity, more effective cell penetration, and specific disease targeting. This has led to the production of several helpful oral medications. There are also promising early results from some of the more entrepreneurial nanomedicine experiments.

There are also several nanotech applications that affect businesses directly through manufacturing and technology development. In manufacturing and production, nanotechnology allows for the creation of incredibly detailed and smooth machine parts, literally constructed to molecular plans. Production machines can then run more quickly, needing far less maintenance and wearing down much more slowly.

Many nanotech applications have occurred in the technology field, especially in computer devices that can be constructed on a microscopic level to perform more efficiently or to carry out tasks they could not with normal production methods. Tolfree and Jackson, in their 2007 book Commercializing Nanotechnology Products, give several examples of computer systems effectively built by nano-construction. Called micro-electromechanical systems (MEMS), these devices have been produced mostly in Germany, Japan, and the United States, and they include:

  • Ink-jet heads, which are created through nanotechnology to be very precise. These ink-jet heads are being developed primarily by Hewlett-Packard, which has designed a model that does not require complete replacement.
  • Pressure sensors, which have some medical uses but are popular in the automotive industry. These MEMS can monitor tire pressure for automobiles, a technology that is growing more quickly than other sensor applications.
  • Silicon microphones, a line of products that is growing rapidly.
  • Accelerometers, which are used not only in vehicles but also in a number of handheld devices, such as cell phones, global positioning systems, and simple pedometers. This is considered a strong market, but a difficult one to profit in because of price pressures and competition.
  • Gyroscopes, which like accelerometers are used in automobiles and phones.
  • Optical MEMS, pioneered by Texas Instruments, deal with imaging sensors and televideo communications.
  • Microfluidics, a fairly small market that works with polymers to create chemical reactions or detection screens.
  • Micro fuel cells, a new market that uses nanotechnology to form microscopic fuel cells, usually powered by hydrogen or methanol, to create energy.

The twenty-first century has also seen a rise in nanotechnology patents, as more products and processes become open to nanotech possibilities and more companies claim to be using nanotechnology to create their products and technology. Because of the myriad possible applications, it is difficult to be sure which companies are truly using nanotech abilities and which patents will prove successful.

Entrepreneurs have an exciting new field to invest in and explore with technology, but the new science also requires several important choices. Should entrepreneurs attempt to incorporate nanotechnology into existing business systems and manufacturing, or should they develop a new business structure to market purely nanotech products? How can performance and efficiency be improved by nano-designed properties? Amanda Kooser gives several guidelines for such entrepreneurial questions in her article, Think Big with a Nanotechnology Business:

  1. Look to the labs. The pioneering of nanotechnology is being conducted in company and university labs where scientists are exploring the possibilities of different particles. To begin an effective entrepreneurial endeavor in nanotechnology, it is important to take advantage of the newest discoveries, and these discoveries begin in the laboratory.
  2. Find the business connection. Nanotechnology will not be profitable unless it helps an existing industry or creates a new market. Entrepreneurs need to connect a particular business need with a solution that nanotechnology can currently provide.
  3. Leverage new discoveries. With all the hype concerning nanotechnology, many companies are curious to see what products and abilities will be developed next. Keeping an eye on new discoveries will lead to profitable solutions for business systems.
  4. Uncover the compelling reasons. Why should a business use the entrepreneur's nanoproduct? Or, if the entrepreneur is incorporating nanotechnology into an existing system, what are the clear benefits?
  5. Keep up on the latest and greatest. Subscribing to nanotechnology journals and attending nano conferences are great ways to keep up on the latest scientific events. For simpler research, there are many Web sites and several helpful books as well.

NANOTECHNOLOGY ETHICS AND RESPONSIBILITY

There are several concerns about nanotechnology, and most center on the fact that nanoparticles are recent creations, not fully understood, and often have different properties than the same element at normal size. The British Royal Society even petitioned, in 2004, for nanoparticles to be classified as new substances and so subjected to stringent definition and testing. The common concern is that nanoparticles will have harmful, toxic effects on both humans and the environment. Many biological rules do not apply to nanoparticles: they can travel easily through tissue, they are often not affected by antibodies, and they can accumulate in organs such as the brain. Most nanoparticles can also be spread through the air and pass through filters that would keep larger particles out.

Because every new nanoparticle created has unknown, possibly harmful biological consequences (especially in the long term), scientists apply what is now known as the precautionary principle to current nano research. Coined as a political and ethical standpoint, the precautionary principle states that any scientific endeavor that has any potential to cause harm to society or the environment should first be agreed upon as safe by a consensus of the applicable scientific community. In other words, if there is doubt, the research is out. However, despite ethical frameworks put into place to guide nanotech science, some argue that key issues have not yet been addressed.

A European team headed by scientist Stephen Hansen released a 2008 report that suggested several problems with current attitudes toward nanotechnology. In general, the team said, scientists and governments have ignored early warning signs and stand to repeat mistakes made in the past with sciences such as atomic radiation. The call for more information can sometimes be used by the governments involved as an excuse for not taking action. Hansen also pointed out that many governments that legislate rules for nanotechnology also openly support it, leading to accountability problems. The team also found that the critical ethical questions about nanotechnology do not seem to be answered by current research.

The key to responsible discoveries in nanotechnology appears to be a mixture of applying the precautionary principle and being innovative in the uses of nanotechnology. Even with the enormous pressure placed on it by businesses and media, nanotechnology is a highly promising field that could provide an entire array of revolutionary products.

BIBLIOGRAPHY

Allhoff, Fritz, and Patrick Lin. Nanotechnology and Society: Current and Emerging Ethical Issues. Springer, 2007.

Angelucci, Rocky. A beginner's guide to technology. Dallas Business Journal 7 Sept 2001.

Baker, Stephen, and Adam Aston. The Business of Nanotechnology. Business Week. 14 Feb 2005.

Berger, Michael. The Current Status of Nanotechnology-Based Therapeutics in Humans. Nanowerk, 2008. Available from: http://www.nanowerk.com/spotlight/spotid=6516.php.

Berger, Michael. Late Lesson from Early Warnings for Nanotechnology. Nanowerk. Nanowerk LLC, 2008.

Jones, K.C. Nanocrystal Discovery Has Solar Sell Potential. Techweb News. Information Week, 2006.

Kanter, James. As nanotechnology gains ground, so do concerns. International Herald Tribune, 2008. Available from: http://www.iht.com/articles/2008/06/24/business/greencol25.php.

Kooser, Amanda. Think Big with a Nanotechnology Business. Entrepreneur.com, 2006. Available from: http://www.entrepreneur.com/startingabusiness/businessideas/article170774.html.

A Little Risky Business. The Economist, 22 Nov 2007.

Mendelson, Abbey. The Big Business of Nanotechnology. Popcity, 2007. Available from: http://www.popcitymedia.com/features/0508nanotech.aspx.

The Nanotech Industry Is Moving from Research to Production. Business Wire, 14 Feb 2008. Available from: http://www.allbusiness.com/science-technology/materials-science/6786843-1.html.

A Short History of Nanotechnology. Foresight Nanotech Institute, 2008. Available from: http://www.foresight.org/nano/history.html.

Tolfree, David, and Mark J. Jackson. Commercializing Nanotechnology Products. CRC Press, 2007.

V.H. Crespi: Carbon Nanostructures. Penn State Physics, 2007. Available from: http://www.phys.psu.edu/people/display/index.html?person_id=202;mode=research;research_description_id=419.

What is Nanotechnology? Center for Responsible Nanotechnology, 2008. Available from: http://www.crnano.org/whatis.htm.

Nanotechnology


Nanofabrication techniques

Theoretical methods

Conventional nanoscale devices

Quantum-effect nanoscale devices

Tangible advances

Resources

Nanotechnology describes technologies where the component parts can measure just a few atomic diameters (generally around a millionth of a millimeter). The general goal of nanotechnology research programs is to reduce complex and sophisticated machinery into very small operational units.

Nanotechnology involves the development of techniques to build machines from atoms and molecules. The name comes from nanometer, a length equal to one-billionth of a meter. It also covers new electrical devices that depend on quantum effects, which arise when the dimension of a structure is only a few atoms across. Because the techniques best suited for fabricating devices on the sub-micron scale originated in semiconductor processing technology for the production of integrated circuits, the nanoscale devices described here are all based on semiconductors.

The theory of nanotechnology was the brainchild of American engineer K. (Kim) Eric Drexler (b. 1955), who, as a student of genetic engineering at the Massachusetts Institute of Technology in 1971, envisioned the potential for building machines and materials with atoms as basic building blocks, just as living matter is constructed with deoxyribonucleic acid (DNA). Until the twentieth century, the physics of atoms and molecules was an invisible science. Atoms finally became visible in 1981 when the scanning tunneling microscope was built. The microscope has the ability to scan through the clouds created by electrons around atoms so that individual atoms can be seen. The doorway to nanotechnology was opened in 1985 by American chemist and physicist Richard Errett Smalley (1943–2005), a professor at Rice University, who found a way to produce carbon in a third form (the two natural forms are graphite and diamond) that is crystalline and possesses incredible strength. The molecule contains 60 carbon atoms and has a shape much like a soccer ball or a geodesic dome of the kind built by American architect Buckminster Fuller (1895–1983), so the molecule was named the buckyball in Fuller's honor.

By 1991, experiments with buckyballs led to long cylindrical strings of the molecules that are tube- or straw-like and are called buckytubes or nanotubes. Many scientists think nanotubes are a fundamental unit for building countless other nanodevices. Nanotubes can be flexed and woven, and they are being made into experimental fibers for use in ultralight bulletproof vests, as one example. Nanotubes are also excellent conductors, and they may be used to construct atomically precise electronic circuitry for more advanced computers and flat-panel displays. They are important to the emerging field of molecular nanotechnology (MNT).

Nanofabrication techniques

Nanoscale devices are made using a combination of different fabrication steps. First is the growth stage, in which layers of different semiconductor material are grown on a substrate, providing structure in one dimension. Second is the lithography/pattern transfer stage, in which a pattern is imposed on a uniform layer, giving structure in the second and third dimensions. Repetition of these two stages results in the production of very complex, three-dimensionally structured nanoscale semiconductor devices.
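As a purely illustrative sketch of the two-stage loop just described (not a process simulator), the snippet below models a device as a stack of grown layers, each of which then receives a lithographically transferred pattern. The materials, thicknesses, and pattern names are hypothetical examples.

```python
def grow(material: str, thickness_nm: float) -> dict:
    """Growth stage: deposit a uniform layer, defining structure in one dimension."""
    return {"material": material, "thickness_nm": thickness_nm, "pattern": None}

def transfer_pattern(layer: dict, mask: str) -> dict:
    """Lithography/pattern-transfer stage: impose a 2D mask on the layer,
    defining structure in the remaining two dimensions."""
    layer["pattern"] = mask
    return layer

# Repeating the two stages builds up a (toy) three-dimensional structure.
recipe = [
    ("GaAs", 50, "contact pads"),
    ("AlGaAs", 20, "quantum-well mesa"),
    ("GaAs", 50, "waveguide stripe"),
]
device = [transfer_pattern(grow(material, t), mask) for material, t, mask in recipe]

for i, layer in enumerate(device):
    print(f"layer {i}: {layer['thickness_nm']} nm {layer['material']}, pattern: {layer['pattern']}")
```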

Growth stage

In the growth stage, each successive layer has a different composition to impose electrical or optical characteristics on the carriers (electrons and holes). A variety of growth techniques can be used; molecular beam epitaxy (MBE) and metallo-organic chemical vapor deposition (MOCVD) have proved to be the most useful for nanostructures because they can be used to grow layers of a predetermined thickness to within a few atoms. In MBE, a gun fires a beam of molecules at a substrate upon which the semiconductor crystal is to be grown. As the atoms hit the surface, they adhere and take up positions in the ordered pattern of the crystal, so a near perfect crystal can be grown one atomic layer at a time. The mixture of material in the beam is changed to produce different layers; for example, the introduction of aluminum to the growth of gallium arsenide will result in the production of a layer of aluminum gallium arsenide.
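To illustrate what growing layers "to within a few atoms" implies, here is a rough estimate; the GaAs monolayer thickness used (about 0.28 nm) is an assumed textbook value, and the target thickness is a hypothetical example.

```python
GAAS_MONOLAYER_NM = 0.28     # assumed thickness of one GaAs atomic layer
target_thickness_nm = 10.0   # hypothetical layer in a nanostructure

layers = target_thickness_nm / GAAS_MONOLAYER_NM
print(f"A {target_thickness_nm} nm GaAs layer is roughly {layers:.0f} atomic layers thick,")
print("so 'within a few atoms' means controlling thickness to well under a nanometer.")
```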

Lithography/pattern transfer stage

Epitaxial growth allows the formation of thin planes of differing materials in one dimension only. The two-stage process of lithography and pattern transfer is used to form structures in the other two dimensions. In the lithography stage (see Figure 1), a film of radiation-sensitive material known as the resist is laid on top of the semiconductor layer where the structure is going to be made. A pattern is exposed on the resist using electrons, ions, or photons. The film is altered during the exposure step, thus allowing the resist to be chemically developed as a relief image. This image is then transferred to the semiconductor by doping (adding minute amounts of foreign elements), etching, growing, or lift-off as shown in Figure 1.

The exposure pattern can be written on the resist in a line-by-line manner using an electron or ion beam or it can be imprinted all at once using a mask, much like spray painting a letter on a wall using a template. Exposure by writing the pattern with a beam is especially useful for making prototype structures, because it avoids the expense of making a new mask for each pattern; features as small as a few nanometers can be written this way.

Writing patterns one line at a time is time consuming, however, and expensive when it comes to producing large quantities, so masks are used that expose a number of chips simultaneously. Masks can be used with electron or ion beams or with photons. It is important to note that the wavelength of light for exposing a pattern has to be less than the smallest feature being exposed.

Where nanoscale dimensions are involved, this necessitates the use of vacuum ultraviolet light or x rays. Penetration of the x rays into the semiconductor must be avoided to prevent damage to the crystal, so wavelengths of 1.3 nm or 4.5 nm are preferred because the polymeric resist exhibits an absorption depth of about 1 micron at these wavelengths.
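For reference, the photon energies corresponding to the x-ray wavelengths mentioned above can be estimated from E = hc/λ. This is an illustrative calculation, not a figure from the article.

```python
H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electron-volt

for wavelength_nm in (1.3, 4.5):
    energy_ev = H * C / (wavelength_nm * 1e-9) / EV
    print(f"wavelength {wavelength_nm} nm -> photon energy ~ {energy_ev:.0f} eV")
# Both wavelengths are far smaller than typical nanoscale feature sizes,
# satisfying the rule that the exposing wavelength must be below the feature size.
```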

The requirement on the wavelength of the x rays and the obvious need to have the x ray beam directed precisely necessitates the use of a well-controlled x ray source, such as a synchrotron. Laser-based x ray sources are currently being considered as a less expensive alternative to the synchrotron.

Theoretical methods

In order to progress in the fabrication and diagnostics of nanostructures and surfaces, scientists are developing more advanced spectroscopic and imaging methods. Highly charged ion-surface interaction using an electron beam ion trap (EBIT) facility is one such method being examined by researchers in the Department of Physics at the University of Nevada and at the Lawrence Livermore National Laboratory. Researchers deposited a large amount of potential energy from single ions at the surface, leading to localized surface defects of sub-nanometer size. Comparison of computer simulations with experimental x-ray spectra showed the formation of subsurface hollow atoms.

Conventional nanoscale devices

In conventional semiconductors, the carrier velocity is limited to about 10⁷ cm/sec, which results in limitations of current density and turn-on time, or frequency response. Because of the need for faster devices (it takes less time for a carrier to cross a smaller device) and the desire to squeeze more devices onto a single chip (e.g., a processor chip for computers), significant advances have been made since the 1970s in fabricating smaller devices.

Additional advantages have been found for operating devices in the nanoscale regime. The carrier velocity limit of about 10⁷ cm/sec is set in large-scale semiconductors by collisions with crystalline defects, phonons, etc., and can be characterized by a mean free path between collisions: the longer the mean free path, the fewer the collisions and the higher the carrier velocity. It has been found that in nanoscale devices the device itself can be considerably shorter than the mean free path, in which case a carrier injected at a high velocity, say 10⁸ cm/sec, never suffers any collisions and, therefore, does not slow down. This phenomenon is termed ballistic transport and allows semiconductors to operate far faster than was possible before, reaching frequencies of 200 GHz (gigahertz).
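A back-of-the-envelope way to see why shorter devices are faster is to compute the transit time, the length of the device divided by the carrier velocity; its inverse sets a rough upper bound on switching frequency. The device lengths below are hypothetical examples, and real devices, as noted above, operate well below this idealized bound.

```python
BALLISTIC_VELOCITY_CM_S = 1e8  # injected carrier velocity cited above, cm/s

for length_nm in (1000, 100, 50):
    length_cm = length_nm * 1e-7          # 1 nm = 1e-7 cm
    transit_s = length_cm / BALLISTIC_VELOCITY_CM_S
    print(f"{length_nm:>5} nm device -> transit time ~ {transit_s * 1e12:.2f} ps "
          f"(inverse ~ {1 / transit_s / 1e12:.1f} THz)")
```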

There are complexities associated with the manufacturing of conventional nanoscale devices that appear to be major obstacles on the way to producing large quantities of these types of devices. This difficulty in manufacturing, coupled with the fact that ballistic transport devices perform best at low signal levels (as opposed to conventional electronics that operate well at high power levels) suggests that these new devices will not become widely available for use in computer chips in the near future.

Quantum-effect nanoscale devices

According to the laws of quantum mechanics, free carriers in a metal or semiconductor can only take on specific values of energy, as defined by the crystal structure; that is, the energy is quantized. For most practical purposes, there are so many closely spaced energy levels, it appears that the carriers have a continuum of possible energies, except for the well-defined gaps characteristic of semiconductors. When the carrier is confined to a region where one or more of the dimensions reach the range of less than 100 nm, the quantum energy levels begin to spread out and the quantum nature becomes detectable. This reduction in size can take place in one, two, or three dimensions, using the fabrication techniques discussed earlier, yielding structures known respectively as superlattices, quantum wires, and quantum dots.
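The size at which quantization "begins to spread out" can be estimated with the textbook particle-in-a-box model, where the allowed energies are E_n = n²h²/(8mL²) for a carrier of mass m confined to a box of width L. The sketch below uses the free-electron mass and example widths; it is an illustration of the scaling, not a calculation from the article (a semiconductor's smaller effective mass would make the spacings larger).

```python
H = 6.626e-34    # Planck's constant, J*s
M_E = 9.109e-31  # free-electron mass, kg
EV = 1.602e-19   # joules per electron-volt

def level_spacing_ev(width_nm: float) -> float:
    """Gap between the n=1 and n=2 particle-in-a-box levels, in eV."""
    L = width_nm * 1e-9
    return (4 - 1) * H**2 / (8 * M_E * L**2) / EV

for width_nm in (1000, 100, 10):
    print(f"box width {width_nm:>5} nm -> level spacing ~ {level_spacing_ev(width_nm):.1e} eV")
# At micron widths the spacing is negligible compared with thermal energy
# (~0.025 eV at room temperature), so the levels look continuous; near 10 nm
# the spacing becomes large enough for quantum effects to be detectable.
```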

When electrons are introduced into a semiconductor structure, they migrate to those positions where their energy is lowest, much like a ping-pong ball will come to rest in a dimple on a waffled surface. If the nanostructure is engineered correctly, then the electrons will settle in the nanostructure itself and not in the adjacent layers. These carriers will then exhibit the quantum effects imposed on them by the nanostructure. The ability to engineer artificial atoms and molecules in semiconductors using nanofabrication techniques has resulted in a powerful new tool in creating novel semiconductor devices, such as quantum dots where the number of carriers trapped by the dot can be controlled by an external voltage.

It appears possible that nanoscale quantum-effect devices may become widely used in complex electronic systems, such as a neural array of quantum dots spaced only a few hundred nanometers apart, but this will only take place after significant progress has been made in fabrication and tolerances.

Scientists see a universe of potential in nanotechnology, following years or perhaps decades of research and development. Some of the applications they foresee are as follows: surgical instruments of molecular scale that are guided by computers of the same size; rocket ships for the individual made of shatterproof materials created by nanomachines; synthesis of foods by molecules and an end to famine; pollution-free manufacturing and molecular devices that clean up existing pollution without human intervention; consumer goods that assemble themselves; reforming of soil (termed terraforming) both on the Earth and other planets where soil and rock may not exist; and computers capable of more computations in one second than all the existing semiconductor devices in the world combined. Nanodevices may create smart matter that, when used to build a bridge or a high-rise building, knows when and how to repair itself; diamonds of perfect quality and any size may be built atom by atom to suit industrial needs or an individual's ideal; and injectable molecular robots may enter the bloodstream on seek-and-destroy missions for cancer, AIDS (acquired immune deficiency syndrome), invading bacteria or viruses, and arterial blockages. Similarly, nanoparticles might carry vaccines and drugs directly to the source of the ailment.

Tangible advances

A research team led by Chad Mirkin, Charles E. and Emma H. Morrison Professor of Chemistry and director of Northwestern's Center for Nanotechnology (Evanston, Illinois), developed a method of writing nanostructures with what they have termed the world's smallest plotter. The device writes multiple lines of molecules, each line only 15 nm or 30 molecules wide and only 5 nm or about 200 billionths of an inch apart. The plotter is based on the researchers' dip-pen nanolithography (DPN) technique. Dip-pen nanolithography draws tiny lines with a single ink, or type of molecule. The nanoplotter prints multiple inks, or four different kinds of molecules, side by side while retaining the chemical purity of each line. DPN, described in the January 29, 1999, issue of Science, turns an atomic force microscope (AFM) into a writing instrument. Researchers first apply an oily ink of octadecanethiol (ODT) to the AFM's tip. When the tip is brought into contact with a thin sheet of gold, which serves as the paper, the ODT molecules are transferred to the gold's surface via a tiny water droplet that forms naturally at the tip. The new nanoplotter multiplies this technique, laying down a series of molecular lines with extreme accuracy. While the microfabrication of electronic circuits and other products currently uses solid-state or inorganic materials, innovations such as the nanoplotter will direct future technologies toward the use of organic and even biological materials. Since its introduction, DPN has provided high-resolution patterning capabilities for many molecular and biomolecular inks on several substrates, such as semiconductors, metals, and monolayer-functionalized surfaces. As of 2006, the Mirkin Group at Northwestern uses DPN to discover new information within surface science and to create technologically unique nanostructures.

Coupling the organic and inorganic, biological engineers at Cornell University (Ithaca, New York) demonstrated the feasibility of small, self-propelled bionic motors that do their builders' bidding in plant, animal, or human cells. Such machines could travel through the body, functioning as mobile pharmacies dispensing precise doses of chemotherapy drugs exclusively to cancer cells, for example. The device, the result of integrating a living molecular motor with a fabricated device at the nanoscale, is a few billionths of a meter in size. The first integrated motor, a molecule of the enzyme ATPase coupled to a metallic substrate with a genetically engineered handle, ran for 40 minutes at three to four revolutions per second.

Researchers at Rensselaer Polytechnic Institute (Troy, New York) are working with materials comprising common atoms arranged in grains less than 100 nm in diameter, 10,000 times smaller than grains in conventional materials. These grains are used as building blocks to create materials with entirely new properties, which the researchers predict could revolutionize everything from drug delivery to sunscreens.

Key Terms

Ballistic transport: Movement of a carrier through a semiconductor without collisions, resulting in extraordinary electrical properties.

Carriers: Charge-carrying particles in semiconductors, namely electrons and holes.

Epitaxy: The growth of crystalline layers of semiconducting materials in a layered structure.

Integrated circuits: Complex electronic circuits fabricated using multiple growth and lithography/pattern transfer stages to produce many miniature electronic elements on a monolithic device.

Although it is still in the early stage of development, not unlike that of computer and information technology in the 1950s, nanotechnology is a rapidly expanding scientific field.

Defense programs in many countries are now concentrating on nanotechnology research that will facilitate advances such as secure but small messaging equipment, smart weapons, improved stealth capabilities, specialized sensors (including bio-inclusive sensors), self-repairing military equipment, and better development and delivery mechanisms for medicines and vaccines.

See also Microtechnology.

Resources

BOOKS

Mulhall, Douglas. Our Molecular Future: How Nanotechnology, Robotics, Genetics, and Artificial Intelligence Will Change Our World. Amherst, NY: Prometheus Books, 2002.

Ratner, Mark A., and Daniel Ratner. Nanotechnology: A Gentle Introduction to the Next Big Idea. Upper Saddle River, NJ: Prentice Hall Publishers, 2002.

PERIODICALS

Bennewitz, R., et al. Atomic scale memory at a silicon surface. Nanotechnology 13 (2002): 499–502.

Lee, K. B., et al. Protein Nanoarrays Generated by Dip-Pen Nanolithography. Science 295, no. 5560 (2002): 1702–1705.

Lim, J-H, et al. Direct-Write Dip-Pen Nanolithography of Proteins on Modified Silicon Oxide Surfaces. Angewandte Chemie International Edition 42, no. 20 (2003).

Zhang, H., et al. Biofunctionalized Nanoarrays of Inorganic Structures Prepared by Dip-Pen Nanolithography. Nanotechnology 14 (2003): 1114–1117.

OTHER

National Science and Technology Council. National Nanotechnology Initiative <http://www.nano.gov/index.html> (accessed October 18, 2006).

Iain A. McIntyre

Nanotechnology


Imagine a world in which manufacturing and medical treatments take place solely at a molecular level, a world in which human bodies are reengineered to include more durable tissues or to reverse past injuries. These are some of the dreams motivating scientists and engineers pursuing the field of nanotechnology. As the name implies, nanotechnology involves the engineering or manipulation of matter, and life, at nanometer scale, that is, one-billionth of a meter. (Ten hydrogen atoms side by side span 1 nanometer; the DNA molecule is 2.3 nanometers across). If feats such as those mentioned above were possible, then the structures of the human body and the current tools of humankind could be significantly altered. In recent years many governments around the world, including the United States with its National Nanotechnology Initiative, and scores of academic centers and corporations have committed increasing support for developing nanotechnology programs (Glapa).

The Birth of an Idea

The idea behind nanotechnology originated with Nobel laureate Richard Feynman in a speech he gave to the American Physical Society in 1959. He described the development of tools for molecular engineering, whereby things would be built molecule by molecule. He proposed, as a challenge to his colleagues, the writing of the entire Encyclopedia Britannica on the head of a pin. His startling claim was that this sort of task would not require a new understanding of physics and was completely compatible with what scientists already understood about the nature of force and matter. Little was done in response to the Feynman challenge until the publication of works by K. Eric Drexler in the 1980s and 1990s. Drexler demonstrated the feasibility of such manipulation from an engineering perspective and provided a vision for the possible benefits of such technologies.

What Could Nanotechnology Do?

The list of potential uses of nanotechnology continues to expand. The primary focus of research at this point concerns miniaturization of electronic components (Bachtold et al.; Hornbaker et al.), but nanoscale materials may dramatically improve the durability of materials used in machinery and could result in less polluting and more efficient production methods. The U.S. military has a significant interest in nanotechnology and has created the Institute for Soldier Nanotechnologies (ISN). Among the initial aims of the ISN is to create stealth garments (and coatings) that are difficult to see or detect, are highly durable, and provide increased protection from penetrating objects. The institute aims to develop devices to rapidly and accurately detect biological or chemical weapon attacks. The ISN is also interested in using nanotechnology to help seamlessly integrate electronic devices into the human nervous system—creating the cyborg soldier.

There are many possible medical uses of microscopic, subcellular machines. Medical applications of nanotechnology include rational drug design; devices specifically targeting and destroying tumor cells (McDevitt et al.) or infectious agents; in vivo devices for the manufacture and release of drugs and for tissue engineering or reengineering at the site of need; early detection or monitoring devices; in vitro diagnostic tools amounting to a laboratory on a chip (Park, Taton, and Mirkin); devices to clear atherosclerotic lesions in coronary or cerebral arteries; and biomimetic nanostructures to repair or replace DNA or other organelles. Nanotechnology might be used to provide artificial replacements for red blood cells and platelets (Freitas, 1996), to augment or repair interaction between neurons in the brain, to improve biocompatibility and the interface between brain tissue and cybernetic devices, and to develop more durable prosthetic devices or implants (Drexler, 1986; Drexler and Peterson; Freitas, 1999; Crandell; BECON). Such tools have also been envisioned to provide new means of cosmetic enhancement, such as controlling weight, changing hair or skin color, removing unwanted hair, or producing new hair simulations (Crawford). Also, some of the potential therapeutic uses previously listed would lead to more effective treatment of life's greatest killers, such as cancer, infectious disease, and vascular disease, leading in turn to greatly enhanced human lifespans.

One other possible project to arise from nanotechnology has become the focus of a rigorous debate among members of the nanocognoscenti. This controversial device is the self-replicating assembler, which was first envisioned by Drexler in 1986. The assembler is in essence a form of artificial life, for not only would it manipulate its environment on a molecular or atomic level, as other nanomachines would, but it would also be coded and designed to replicate itself, potentially making endless copies of itself. Alternatively, nanomachines could be designed to function more as viruses, using the mechanisms in other living cells to help duplicate constituent parts and assemble them into a new machine. While it is beyond the scope of this entry to detail the elements of the debate between those who contend such devices can and will be developed and those who adamantly claim that Drexlerian assemblers are a physical impossibility, the assembler is an excellent starting point for the discussion of the ethical aspects of nanotechnology.

Ethical Issues

The ethical issues of nanotechnology can be grouped into five categories:

  1. the challenges of prospective technology assessment and regulation;
  2. environmental impact of nanotechnologies;
  3. issues of justice and access to the goods and services that might accrue from nanotechnology;
  4. the ethical and social implications of increased longevity that might result from medical nanotechnology; and
  5. the issues of augmentation or enhancement of human attributes and function.

Accidents, Abuses, and Regulation

The vision for medical uses of nanotechnology is exciting, and if only a portion of the proposed devices prove possible, nanotechnology may benefit many thousands of patients. Any device that can operate on the subcellular level, however, can just as easily be designed to destroy as to repair or heal. In fact, it will be far easier to develop devices that kill. One of the first applications of medical nanotechnology involves a device that can target and destroy cancer cells. Despite the arguments over the feasibility of creating assemblers, it is not a far stretch to envision nanoscale weapons that could be borne on the winds or delivered through the water or food supply. Even if not self-replicating, such devices, with appropriate targeting or with the ability to synthesize toxic substances once inside the host could prove to be quite lethal or disabling. If assemblers were ever created, with the ability to self-replicate like bacteria, then the level of personal or environmental harm could be substantial.

Concern over the potential military or terrorist use of such technology, which could ultimately be fairly cheap to produce, and thus impossible to sufficiently regulate once in existence, has led some (even within the technology community) to contend that the only safe way to proceed is to choose not to develop the tools and methods of nanotechnology at all (Joy). In this view, the only way to prevent the potential devastating harms of a technology, or the consequences of malicious use of knowledge and technology, is to not develop the technology, or acquire the knowledge, in the first place. Arguments of this type, however, assume the burden of proving:

  1. that the projected abilities of the device in question are possible to achieve;
  2. that the feared harms cannot be prevented, controlled, or mitigated to an acceptable degree;
  3. that it is feasible to achieve universal consensus that the area of technology and/or knowledge in question should not be pursued; and
  4. that such a prohibition can be sufficiently policed.

In the case of the first issue, it seems very likely that biological nanodevices will be developed, most likely using a so-called bottom-up approach. That is, existing biological molecules and organelles will be used as models for creating tools to achieve the desired function, or these "natural" materials will be used in new ways. An example of this is a project that involved the conversion of the ATPase molecule, ubiquitous in living cells, into a molecular motor (Soong et al.). Therefore, because the development of functioning biomechanical nanodevices is highly probable, it is morally imperative to prospectively evaluate the possible impact of these technologies as they are being developed, so that appropriate safeguards can be implemented to protect against accidents, unanticipated consequences, or inappropriate uses of the technology.

While many disagree with Joy's conclusions, his concerns for the potential harms that autonomous technology could produce are legitimate. It is his response to the second issue, the likely ability or inability to control or protect against foreseeable or unforeseeable harms, that has led to the most dissent. Concerns have been raised that autonomous, self-replicating assemblers could escape control, and/or mutate, in such a way as to destroy life and the environment on a massive, cataclysmic scale. This is Drexler's (1986) so-called "gray goo scenario." In a 2000 article, however, Robert A. Freitas Jr. calculated that this nightmarish scenario is unlikely because of the ability to detect the activity of such biovorous devices early on and to neutralize them. In the early days of recombinant DNA research, there were many concerns about releasing lethal plagues into the environment, quite similar to a number of the concerns being voiced about nanotechnology. Yet the scientific community responded strongly and wisely to the challenges of DNA research, establishing procedural safeguards that remain in use (Krimsky; Fredrickson) and that serve as a model for developing and containing potentially harmful technologies.

Pursuing a similar course of prospective risk assessment and guideline development, the Foresight Institute published the "Foresight Guidelines on Molecular Nanotechnology" in 2000. The guidelines remain voluntary recommendations, but they could be used as a framework for formal regulation and licensing of biologically active nanodevices. Some of the recommended design principles include: (1) dependence on a single fuel source or cofactor that does not exist in the natural environment; (2) requiring constant signaling from an external source for the device to continue functioning; and/or (3) programming termination times (similar to apoptosis in living cells). While it is to be hoped that all responsible researchers and engineers would embrace suggestions such as these, there will need to be formal regulation with serious economic, licensure, and punitive penalties for failure to comply. Additionally, the granting of licenses to perform research in nonlaboratory settings or to market nanodevices, as well as the awarding of patents, should be contingent upon proof of the ability to detect and destroy the devices in both in vitro and in vivo settings.

The idea that humankind could reach universal agreement to limit or forbid certain areas of research is naive, and very unlikely to happen, particularly when the field of knowledge in question may lead to vast improvements in health, lifespan, productivity, and so on. Even if consensus could be achieved, policing such restrictions will be essentially impossible. The force of curiosity, as well as the stubborn human heart's universal propensity to rebel against restriction, will ensure that the research will indeed take place, just not as rapidly as it might have otherwise. Rather it is wiser to direct the development of the technology in such a way as to prepare defenses concurrently along with the devices themselves. It is only in this way that humankind and individual societies can be prepared to meet the threats of terrorism, accidents, and other calamities resulting from the creation and/or abuse of a particular technology.

The Nano-Improved Human

As mentioned above, medical nanotechnology may provide exciting tools for healing injured tissue, repairing DNA, and treating neoplastic and infectious diseases, as well as for cosmetic applications. It is conceivable that some nanodevices may also be used to strengthen normal tissue; to manipulate certain DNA strands to alter traits; or to augment mental function, either via enhanced electronic interfaces at the cellular level or by direct stimulation of certain neural pathways. These latter possibilities immediately bring up difficult questions.

Should such uses of bionanotechnology be permitted? If they should, should the medical profession be involved with nonhealing, elective augmentation, and if not, then who should? Should people be allowed to use health insurance to cover the cost of such interventions? How can just access be ensured otherwise? Such augmentations, if successful, would create significant differentials in performance in the workplace, physical abilities, and so on. Consequently, the wealthy would get stronger and wealthier, further increasing their advantage over those who might not be able to afford the technology in question.

In his 1999 book Nanomedicine, Freitas suggested that nanotechnology, and by implication other potentially augmenting technologies, requires a new concept of disease that transcends the classic model of disordered function. He calls this new model the volitional normative model of disease, and he described it as follows:

Disease is characterized not just as the failure of "optimal" functioning, but rather as the failure of either (a) "optimal" functioning or (b) "desired" functioning. Thus disease may result from:

  1. failure to correctly specify desired bodily function (specification error by the patient),
  2. flawed biological program design that doesn't meet the specifications (programming design error),
  3. flawed execution of the biological program (execution error),
  4. external interference by disease agents with the design or execution of the biological program (exogenous error), or
  5. traumatic injury or accident (structural failure). (Freitas, 1999, p. 20)

While encompassing traditional understandings of disease, this model additionally takes disease out of the context of an objective pathophysiological assessment and turns disease into whatever the patient defines it to be. Any limitation or undesired trait may now be declared disease. Though ostensibly continuing the contemporary trend of patient self-determination to a new level, this approach is fraught with both danger and injustice. To declare that a condition is disease imposes a moral claim that services ought to be rendered for its modification, elimination, or amelioration. The balance between beneficence (the obligation to do good) and nonmaleficence (the obligation to prevent harm) may be inappropriately tipped to what the patient desires, rather than needs. Physicians would be reduced to agents of wish fulfillment and to technicians, rather than remaining healers. These issues already exist to some degree in the area of cosmetic surgery but will expand to involve most other areas of medicine as well. Further, claims to "treatment" would unjustly deplete healthcare resources and funds, potentially depriving those in real need of legitimate healing.

Conclusion

Nanotechnology offers exciting new tools for materials processing, more powerful and integrated electronic devices, and new medical therapies. Nanodevices, however, may also become instruments of harm, and they require prospective regulation and engineering to prevent both foreseeable and unforeseeable negative consequences. Nanodevices join a number of other developing technologies that offer the potential to alter or augment the human body. A prospective, widespread discussion of the implications of these technologies for the human species, the profession of medicine, and the world's communities should occur as soon as possible.

C. Christopher Hook

SEE ALSO: Biomedical Engineering; Cybernetics; Enhancement Uses of Medical Technology; Human Dignity; Transhumanism and Posthumanism

BIBLIOGRAPHY

Antón, Philip S.; Silberglitt, Richard; and Schneider, James. 2001. The Global Technology Revolution: Bio/Nano/Materials Trends and Their Synergies with Information Technology by 2015. Santa Monica, CA: RAND.

Bachtold, Adrian; Hadley, Peter; Nakanishi, Takeshi; et al. 2001. "Logic Circuits with Carbon Nanotube Transistors." Science 294: 1317–1320.

Crandell, B. C., ed. 1996. Nanotechnology: Molecular Speculations on Global Abundance. Cambridge, MA: MIT Press.

Crawford, Richard. 1996. "Cosmetic Nanosurgery." In Nanotechnology: Molecular Speculations on Global Abundance, ed. B. C. Crandell. Cambridge, MA: MIT Press.

Drexler, K. Eric. 1986. Engines of Creation. New York: Anchor.

Drexler, K. Eric. 1992. Nanosystems: Molecular Machinery, Manufacturing, and Computation. New York: Wiley.

Drexler, K. Eric, and Peterson, Christine. 1991. Unbounding the Future: The Nanotechnology Revolution. New York: Morrow.

Fredrickson, Donald S. 2001. The Recombinant DNA Controversy: A Memoir. Washington, D.C.: ASM Press.

Freitas, Robert A., Jr. 1999. Nanomedicine, vol. 1: Basic Capabilities. Austin, TX: Landes Bioscience.

Grethlein, Christian E., ed. 2002. "DoD Researchers Provide a Look Inside Nanotechnology." Special issue of AMPTIAC Quarterly 6(1).

Hornbaker, D. J.; Kahng, S.-J.; Misra, S., et al. 2002. "Mapping the One-Dimensional Electronic States of Nanotube Peapod Structures." Science 295: 828–831.

Joy, Bill. 2000. "Why the Future Doesn't Need Us." Wired, April, pp. 238–262.

Krimsky, Sheldon. 1982. Genetic Alchemy: The Social History of the Recombinant DNA Controversy. Cambridge, MA: MIT Press.

McDevitt, Michael R.; Ma, Dangshe; Lai, Lawrence T.; et al. 2001. "Tumor Therapy with Targeted Atomic Nanogenerators." Science 294: 1537–1540.

Mulhall, Douglas. 2002. Our Molecular Future: How Nanotechnology, Robotics, Genetics, and Artificial Intelligence Will Transform Our World. Amherst, NY: Prometheus.

Park, So-Jung; Taton, T. Andrew; and Mirkin, Chad A. 2002. "Array-Based Electrical Detection of DNA with Nanoparticle Probes." Science 295: 1503–1506.

Roco, Mihail, and Bainbridge, William Sims, eds. 2001. Societal Implications of Nanoscience and Nanotechnology. Arlington, VA: National Science Foundation.

Soong, Ricky; Bachand, George; Neves, Hercules; et al. 2000. "Powering an Inorganic Nanodevice with a Biomolecular Motor." Science 290: 1555–1558.

INTERNET RESOURCES

BECON (NIH Bioengineering Consortium). 2000. "Nanoscience and Nanotechnology: Shaping Biomedical Technology: Symposium Report, June 2000." Available from <http://www.becon.nih.gov/poster_abstracts_exhibits.pdf>.

Feynman, Richard. 1959. "There's Plenty of Room at the Bottom." Available from <http://www.zyvex.com/nanotech/feynman.html>.

Foresight Institute. 2000. "Foresight Guidelines on Molecular Nanotechnology." Available from <http://www.foresight.org/guidelines/>.

Freitas, Robert A., Jr. 1996. "Respirocytes: A Mechanical Artificial Red Cell: Exploratory Design in Medical Nanotechnology." Foresight Institute. Available from <http://www.foresight.org/Nanomedicine/Respirocytes.html>.

Freitas, Robert A., Jr. 2000. "Some Limits to Global Ecophagy by Biovorous Nanoreplicators, with Public Policy Recommendations." Foresight Institute. Available from <http://www.foresight.org/NanoRev/Ecophagy.html>.

Glapa, Steven. 2002. "A Critical Investor's Guide to Nanotechnology." In Realis. Available from <http://www.inrealis.com/nano.htm>.

Institute for Soldier Nanotechnologies. 2003. Available from <http://www.aro.army.mil/soldiernano/>.

National Nanotechnology Initiative. 2003. Available from <http://www.nano.gov/>.

National Science and Technology Council. 2003. "Nanotechnology: Shaping the World Atom by Atom." Available from <http://itri.loyola.edu/nano/>.

Nanotechnology

views updated Jun 27 2018

Nanotechnology

HISTORY AND HYPERBOLE

DEMOCRATIC AND MORAL RESEARCH

BIBLIOGRAPHY

Nanotechnology, or nanotech, is a collective term for several dozen related techniques that manipulate and manufacture molecules that are measured by the nanometer (one-billionth of a meter, or 10⁻⁹ m in scientific notation). Instruments invented in the early 1980s now enable scientists to observe and rearrange molecules and atoms as never before, thereby enriching our knowledge of the world of the nanoscale. Viruses and atomic surfaces, for example, are understood much better than before, while carbon atoms are arranged into new shapes, including spheres and tubes. Because of its ability to rearrange the building blocks of matter, nanotech has great potential to affect medicine, information technology, materials science, the environment, and other areas. Developments in medical diagnostics and therapeutics, along with smaller, faster computers, are especially exciting, while toxicity and threats to privacy are uncertain but worrisome. Social scientists are interested in nanotech because it also affects economic, cultural, social, and political conditions.

While the scientific basis of nanotech is many decades old, the policy framework emerged in the late 1990s and early 2000s as governments organized public-sector funding and encouraged private-sector investments. Some social scientists began to study nanotech at that time. Many had previously studied biotechnology or information technology, so they brought mature research methods and sophisticated insights to nanotech. Their work consisted not only of observing the emergence of a new technology, but also of trying to influence its direction before society became locked into an unfortunate trajectory of technological determinism. Furthermore, the status of the scholarly literature has been dynamic. Commentaries on nanotech in the sciences, the humanities, or the social sciences become outdated very quickly.

HISTORY AND HYPERBOLE

Four kinds of issues are especially prominent in social science research on nanotech. First there are several contested histories of nanotechnology. One version says that nanotech began with a prescient talk in 1959 by Richard P. Feynman (1918-1988), the 1965 Nobel laureate in physics. Another points to the invention of the scanning tunneling microscope by IBM scientists Gerd Binnig and Heinrich Rohrer in 1981. A third narrative emphasizes the vision popularized by K. Eric Drexler in his 1986 book Engines of Creation. A fourth indicates that the underlying science was well established but intellectually diffuse until January 2000 when President Bill Clinton gathered many strands into one agenda called the National Nanotechnology Initiative (NNI).

In weighing these narratives, the critical perspectives of the social sciences reveal an ideological landscape of explicit and implicit discourses, with much competition to establish definitions, iconic images, and authoritative meanings, not to mention priorities for government funding. The Feynman origin theory appeals to quantum physicists and some people in the California Institute of Technology community, where Feynman taught for more than thirty years. The account that begins with the scanning tunneling microscope is preferred by the IBM community and most nanoscientists other than quantum physicists. The Drexler story is more credible outside of scientific circles than within because it delivers limitless promises of technological salvation, although several scientists credit Drexler for inspiring their scientific work. Finally, the NNI version demands that nanotechnology produce tangible products quickly, which justifies generous government support for science and technology, plus government cheerleading for private-sector developments. It also draws upon a sense of economic nationalism: The NNI is a way for the United States to maintain economic and technological leadership. This story has parallel versions elsewhere, particularly in a series of European Union plans to unify European nanotechnology.

A second set of issues involves the power and consequences of hyperbole, both for and against nanotechnology. Nanotech evokes some intense interpretations of culture and technology: The so-called nano visionaries describe a magical set of tools that will transmute matter, end death, and perform other amazing changes, while their counterparts on the other end of the ideological spectrum preach that nanotech leads to the end of humanity, the end of our environment, and other evils.

These forms of hyperbole cause one to wonder whether they are grounded in the scientific and technical realities of nanotech. Or, do they express hopes and fears unrelated to nanotech reality, but gratuitously superimposed on nanotech? Another question is the changing relation between the technology and the hyperbole. Is the antinano hype discredited when beneficial applications come into our lives? Is there a nanophobic backlash: Do policymakers and nonexperts feel deceived by extravagant promises that turn out to be unrealistic? Social scientists have been tracking these questions of hope and trust. Furthermore, changes in relations between the technology and the hyperbole do not necessarily constitute a victory of one form of hype over another. They can also take the form of centrist positions displacing either form of hype.

DEMOCRATIC AND MORAL RESEARCH

The third cluster of questions appeals to the conscience of the social sciences, for these are the issues of justice and the common good. Nanotechnology has consequences for power, wealth, privacy, trust, discrimination, and other moral questions that animate social scientists. One topic is especially salient, namely, the longstanding problem of how a democratic society uses democratic processes to make science policy. The philosopher and educator John Dewey (1859-1952) argued that when a democratic society makes science policy, it needs many citizens who are well informed about science. Jon D. Miller pursued this by measuring civic scientific literacy beginning in the early 1980s, and he found consistently that it was dreadfully low. In a parallel development in the United Kingdom in 1985, a program for public understanding of science took the form of a simplistic agenda in which scientists talked and nonscientists listened and then passively internalized what they had been told. This is entirely unrealistic.

At the same time, however, social scientists in the United States observed stakeholder democracy in which the general population may be uninterested and inert about scientific policy, but those who see themselves as being affected by a given policy will take an active interest. Participatory democracy is the label for the activism of nonexperts who take part in making science policy. Case studies include AIDS activists playing constructive roles in clinical trials, or environmental disputes in which nonexperts become important actors, or laypersons serving on advisory committees for the National Institutes of Health. The ideas of stakeholder democracy and participatory democracy are corroborated by observations that show that nonexperts can acquire, understand, and deploy technical information when they have to. Meanwhile, social scientists in the United Kingdom advocate something similar called upstream public engagement in nanotechnology policy. This is meant to be an antidote to the simplistic plan of public understanding of science.

Nanotech is not necessarily more suitable to participatory democracy and upstream engagement than other technologies, but it gained attention among nonexperts at the same time that these discourses matured. And so, by historical coincidence, nanotechnology became a platform for experimenting with mechanisms and processes by which nonexperts have active and constructive roles in the creation of science policy. This is likely to be among the most important forms of social science activity concerning nanotechnology.

The fourth area of interest is a soul-searching debate about the moral value of a program named SEIN, which stands for societal and ethical implications of nanotechnology. This is a priori problematic. "Implications" usually suggests that when the new technology arrives, it changes the society, and the consequences are understood after the fact. But if one wants to advocate one policy or another before nanotech causes major disruptions, it is necessary to revise the meaning of SEIN. "Societal interactions with nanotech" suggests that society coevolves with the technology, in which case stakeholders can make decisions about nanotech before technological change becomes a fait accompli.

This leads to an argument about the connection between SEIN and ELSI, the program to study the ethical, legal, and societal implications of the Human Genome Project. ELSI was generally recognized as a successful effort to describe and communicate those topics from the Human Genome Project, but many social scientists felt that ELSI constituted an uncritical acceptance of the agenda of the project. If SEIN is a child of ELSI, does this mean that social scientists are censoring themselves when they receive government funding to do SEIN research?

The U.S. Congress had ELSI in mind as a model when it included a program for SEIN in the Twenty-first Century Nanotechnology Research and Development Act of 2003. For this reason, some social scientists conclude that SEIN is not meant to raise social or ethical questions in government-funded nanotechnology research. They say that SEIN is intended to lubricate popular acceptance of nanotech, that is, to eliminate the social frictions that frustrate technological determinism.

The U.S. government's programmatic documents on nanotechnology often reveal a spirit of technological determinism, but that does not necessarily mean that there is a coherent plan for SEIN that conforms to that spirit. SEIN is only vaguely described in the 2003 act and related documents. Furthermore, the idea of participatory democracy for nanotech (that nonexperts will have active and constructive roles in nanotech policy) is at least as credible among those who are doing government-funded SEIN work as the manipulative view of SEIN. One reason is that much of nanotech is meant to lead to tangible consumer products, and it would be a major disaster for industry and government to misread consumer concerns and values, especially after investing billions of dollars to create those products.

This is not to claim that government science bureaucrats have become leftists. The point rather is that the future and the value of SEIN research are far from determined. There is neither documentation nor experience to conclude that SEIN is intrinsically corrupt for social scientists.

Nanotechnology derives from multiple strands of scientific work, some of which are many decades old. It also evokes numerous everyday issues concerning economy, culture, society, and power, and it is strongly shaped by visions of what will happen in the near future. To a social scientist, this is worth noting: a culture whose past, present, and future are interesting and problematic.

SEE ALSO Microelectronics Industry

BIBLIOGRAPHY

Baird, Davis, and Tom Vogt. 2004. "Societal and Ethical Interactions with Nanotechnology." Nanotechnology Law and Business 1(4): 391–396.

Binnig, Gerd, and Heinrich Rohrer. 1987. "Scanning Tunneling Microscopy: From Birth to Adolescence." Reviews of Modern Physics 59(3), pt. 1: 615–625.

Drexler, K. Eric. 1986. Engines of Creation: The Coming Era of Nanotechnology. Garden City, NY: Anchor.

Feynman, Richard P. 1960. "There's Plenty of Room at the Bottom." Engineering and Science 23: 22–36.

Fisher, Erik. 2005. "Lessons Learned from the Ethical, Legal, and Social Implications Program (ELSI): Planning Societal Implications Research for the National Nanotechnology Program." Technology in Society 27: 321–328.

Guston, David, and Daniel Sarewitz. 2002. "Real-time Technology Assessment." Technology in Society 24: 93–109.

Macnaghten, Phil, Matthew Kearnes, and Brian Wynne. 2005. "Nanotechnology, Governance, and Public Deliberation: What Role for the Social Sciences?" Science Communication 27(2): 268–291.

Munn Sanchez, Edward. 2004. "The Expert's Role in Nanoscience and Technology." In Discovering the Nanoscale, eds. Davis Baird, Alfred Nordmann, and Joachim Schummer, 257–266. Amsterdam: IOS.

Royal Society and Royal Academy of Engineering. 2004. Nanoscience and Nanotechnologies: Opportunities and Uncertainties. London: Royal Society.

Toumey, Chris. 2005. "Apostolic Succession: Does Nanotechnology Descend from Richard Feynman's 1959 Talk?" Engineering and Science 68(1): 16–23.

Chris Toumey

Nanotechnology

views updated Jun 11 2018

Nanotechnology

Nanotechnology describes technologies where the component parts can measure just a few atomic diameters (generally around a millionth of a millimeter). The general goal of nanotechnology research programs is to reduce complex and sophisticated machinery into very small operational units.

Nanotechnology involves the development of techniques to build machines from atoms and molecules. The name comes from "nanometer," which is one-billionth of a meter. It includes the development of new electrical devices that depend on quantum effects that arise when the dimension of a structure is only a few atoms across. Because the techniques best suited for fabricating devices on the submicron scale originated in the semiconductor processing technology used to produce integrated circuits, the nanoscale devices described below are based on semiconductors.

The theory of nanotechnology was the brainchild of K. Eric Drexler, who, as a student of genetic engineering at the Massachusetts Institute of Technology in 1971, envisioned the potential for building machines and materials with atoms as basic building blocks, just as living matter is constructed with DNA. Until the twentieth century, the physics of atoms and molecules was an invisible science. Atoms finally became visible in 1981, when the scanning tunneling microscope was built; the microscope can probe through the clouds created by electrons around atoms so that individual atoms can be seen. The doorway to nanotechnology was opened in 1985 by Richard Smalley, a chemist at Rice University, who found a way to produce carbon in a third form (the two natural forms are graphite and diamond) that is crystalline and possesses incredible strength. The molecule contains 60 carbon atoms and has a shape much like a soccer ball or a geodesic dome of the kind built by the futurist architect R. Buckminster Fuller, so the molecule was named the "buckyball" in Fuller's honor.

By 1991, experiments with buckyballs led to long strings of the molecules that are tube- or straw-like, called buckytubes or nanotubes. Many scientists think nanotubes are a fundamental unit for building countless other nanodevices. Nanotubes can be flexed and woven and are being woven into experimental fibers for use in ultralight bulletproof vests, as one example. Nanotubes are also excellent conductors, and they may be used to construct atomically precise electronic circuitry for more advanced computers and flat-panel displays.


Nanofabrication techniques

Nanoscale devices are made using a combination of different fabrication steps. First is the growth stage, in which layers of different semiconductor material are grown on a substrate, providing structure in one dimension. Second is the lithography/pattern transfer stage, in which a pattern is imposed on a uniform layer, giving structure in the second and third dimensions. Repetition of these two stages results in the production of very complex semiconductor devices structured at the nanoscale in all three dimensions.


Growth stage

In the growth stage, each successive layer has a different composition to impose electrical or optical characteristics on the carriers (electrons and holes). A variety of growth techniques can be used. Molecular beam epitaxy (MBE) and metallo-organic chemical vapor deposition (MOCVD) have proved to be the most useful for nanostructures because they can be used to grow layers of a predetermined thickness to within a few atoms. In MBE, a gun fires a beam of molecules at a substrate upon which the semiconductor crystal is to be grown. As the atoms hit the surface, they adhere and take up positions in the ordered pattern of the crystal, so a near perfect crystal can be grown one atomic layer at a time. The mixture of material in the beam is changed to produce different layers; for example, the introduction of aluminum to the growth of gallium arsenide will result in the production of a layer of aluminum gallium arsenide.


Lithography/pattern transfer stage

Epitaxial growth allows the formation of thin planes of differing materials in one dimension only. The two-stage process of lithography and pattern transfer is used to form structures in the other two dimensions. In the lithography stage (see Figure 1), a film of radiation-sensitive material known as the "resist" is laid on top of the semiconductor layer where the structure is going to be made. A pattern is exposed on the resist using electrons, ions, or photons. The film is altered during the exposure step, thus allowing the resist to be chemically developed as a relief image. This image is then transferred to the semiconductor by doping (adding minute amounts of foreign elements), etching, growing, or lift-off as shown in Figure 1.

The exposure pattern can be written on the resist in a line-by-line manner using an electron or ion beam or it can be imprinted all at once using a mask, much like spray painting a letter on a wall using a template. Exposure by writing the pattern with a beam is especially useful for making prototype structures, because it avoids the expense of making a new mask for each pattern; features as small as a few nanometers can be written this way.

Writing patterns one line at a time is time consuming, however, and expensive when it comes to producing large quantities, so masks are used that expose a number of chips simultaneously. Masks can be used with electron or ion beams or with photons. It is important to note that the wavelength of light for exposing a pattern has to be less than the smallest feature being exposed.

Where nanoscale dimensions are involved, this necessitates the use of vacuum ultraviolet light or x rays. Penetration of the x rays into the semiconductor must be avoided to prevent damage to the crystal, so wavelengths of 1.3 nm or 4.5 nm are preferred because the polymeric resist exhibits an absorption depth of about 1 micron at these wavelengths.

The requirement on the wavelength of the x rays and the obvious need to have the x-ray beam directed precisely necessitate the use of a well-controlled x-ray source, such as a synchrotron. Laser-based x-ray sources are currently being considered as a less expensive alternative to the synchrotron.


Theoretical methods

In order to progress in the fabrication and diagnostics of nanostructures and surfaces, scientists are developing more advanced spectroscopic and imaging methods. Highly charged ion-surface interaction using an electron beam ion trap (EBIT) facility is one such method being examined by researchers in the Department of Physics at the University of Nevada and at the Lawrence Livermore National Laboratory. Researchers deposited a large amount of potential energy from single ions at the surface, leading to localized surface defects of subnanometer size. Comparison of computer simulations with experimental x-ray spectra showed the formation of subsurface hollow atoms.

Conventional nanoscale devices

In conventional semiconductors, the carrier velocity is limited to about 10⁷ cm/sec, which limits current density and turn-on time, or frequency response. Because of the need for faster devices (it takes less time for a carrier to cross a smaller device) and the desire to squeeze more devices onto a single chip, e.g., a processor chip for computers, significant advances have been made since the 1970s in fabricating smaller devices.

Additional advantages have been found for operating devices in the nanoscale regime. The carrier velocity limit of about 10⁷ cm/sec is set in large-scale semiconductors by collisions with crystalline defects, phonons, and other scatterers, and can be characterized by a mean free path between collisions: the longer the mean free path, the fewer the collisions and the higher the carrier velocity. It has been found that in nanoscale devices, the device itself can be considerably shorter than the mean free path, in which case a carrier injected at a high velocity, say 10⁸ cm/sec, never suffers any collisions and therefore does not slow down. This phenomenon is termed ballistic transport and allows semiconductor devices to operate far faster than was possible before, reaching frequencies of 200 GHz.
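To make the scaling argument concrete, the short sketch below (not from the source) estimates the transit-time limit on switching speed: a carrier crossing a shorter device at a higher, ballistic velocity gives a higher frequency ceiling. The lengths and velocities are illustrative values only, and real devices fall well below this bound once capacitance and other circuit effects are included.

    def transit_frequency_hz(length_nm: float, velocity_cm_per_s: float) -> float:
        """Rough frequency limit set by the time a carrier needs to cross the device."""
        length_cm = length_nm * 1e-7          # 1 nm = 1e-7 cm
        transit_time_s = length_cm / velocity_cm_per_s
        return 1.0 / transit_time_s

    # Diffusive transport in a conventional device: ~1,000 nm crossed at 1e7 cm/sec
    print(f"{transit_frequency_hz(1000, 1e7):.1e} Hz")   # ~1e11 Hz
    # Ballistic transport in a nanoscale device: ~100 nm crossed at 1e8 cm/sec
    print(f"{transit_frequency_hz(100, 1e8):.1e} Hz")    # ~1e13 Hz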

There are complexities associated with the manufacturing of conventional nanoscale devices that appear to be major obstacles on the way to producing large quantities of these types of devices. This difficulty in manufacturing, coupled with the fact that ballistic transport devices perform best at low signal levels (as opposed to conventional electronics that operate well at high power levels) suggests that these new devices will not become widely available for use in computer chips in the near future.


Quantum-effect nanoscale devices

According to the laws of quantum mechanics, free carriers in a metal or semiconductor can only take on specific values of energy, as defined by the crystal structure; that is, the energy is quantized. For most practical purposes, there are so many closely spaced energy levels that the carriers appear to have a continuum of possible energies, except for the well-defined gaps characteristic of semiconductors. When the carrier is confined to a region where one or more of the dimensions shrink below about 100 nm, the quantum energy levels begin to spread out and the quantum nature becomes detectable. This reduction in size can take place in one, two, or three dimensions, using the fabrication techniques discussed earlier, yielding structures known respectively as superlattices, quantum wires, and quantum dots.
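A rough sense of why the level spacing becomes noticeable below about 100 nm comes from the textbook particle-in-a-box estimate, E_n = n²h²/(8mL²). The sketch below (not from the source) evaluates the gap between the two lowest levels for two confinement widths; it uses the free-electron mass rather than the effective mass of any particular semiconductor, so the numbers are only illustrative.

    H = 6.626e-34      # Planck's constant, joule-seconds
    M_E = 9.109e-31    # free-electron mass, kilograms
    EV = 1.602e-19     # joules per electron volt

    def level_spacing_ev(width_m: float) -> float:
        """Gap between the n=1 and n=2 particle-in-a-box levels for a box of this width."""
        e1 = H**2 / (8 * M_E * width_m**2)    # ground-state energy, joules
        return (2**2 - 1**2) * e1 / EV        # E2 - E1, converted to electron volts

    print(f"{level_spacing_ev(100e-9):.1e} eV")  # 100 nm confinement: ~1e-4 eV
    print(f"{level_spacing_ev(10e-9):.1e} eV")   # 10 nm confinement: ~1e-2 eV

At 100 nm the spacing is far smaller than the thermal energy at room temperature (about 0.025 eV), so the levels look continuous; at 10 nm it approaches that scale and quantum effects begin to show.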

When electrons are introduced into a semiconductor structure, they migrate to those positions where their energy is lowest, much like a ping-pong ball will come to rest in a dimple on a waffled surface. If the nanostructure is engineered correctly, then the electrons will settle in the nanostructure itself and not in the adjacent layers. These carriers will then exhibit the quantum effects imposed on them by the nanostructure. The ability to engineer artificial atoms and molecules in semiconductors using nanofabrication techniques has resulted in a powerful new tool in creating novel semiconductor devices, such as quantum dots where the number of carriers trapped by the dot can be controlled by an external voltage.

It appears possible that nanoscale quantum-effect devices may become widely used in complex electronic systems, such as a neural array of quantum dots spaced only a few hundred nanometers apart, but this will only take place after significant progress has been made in fabrication and tolerance.

Scientists see a universe of potential in nanotechnology, following years or perhaps decades of research and development. Some of the applications they foresee are as follows: surgical instruments of molecular scale that are guided by computers of the same size; rocket ships for the individual made of shatterproof materials created by nanomachines; synthesis of foods by molecules and an end to famine; pollution-free manufacturing and molecular devices that clean up existing pollution without human intervention; consumer goods that assemble themselves; reforming of soil (termed terraforming) both on Earth and on other planets where soil and rock may not exist; and computers capable of more computations in 1 second than all the existing semiconductor devices in the world combined. Nanodevices may create "smart matter" that, when used to build a bridge or a high-rise building, knows when and how to repair itself; diamonds of perfect quality and any size may be built atom by atom to suit industrial needs or an individual's ideal; injectable molecular robots that enter the bloodstream on seek-and-destroy missions for cancer, AIDS, invading bacteria or viruses, and arterial blockages. Similarly, nanoparticles might carry vaccines and drugs directly to the source of the ailment.


Tangible advances

A research team led by Chad Mirkin, the Charles E. and Emma H. Morrison Professor of Chemistry and director of Northwestern University's Center for Nanotechnology (Evanston, Illinois), developed a method of writing nanostructures with what they have termed "the world's smallest plotter." The device writes multiple lines of molecules, each line only 15 nm (about 30 molecules) wide and only 5 nm (about 200 billionths of an inch) apart. The plotter is based on the researchers' dip-pen nanolithography (DPN) technique. Dip-pen nanolithography draws tiny lines with a single "ink," or type of molecule. The nanoplotter prints multiple "inks," or four different kinds of molecules, side by side while retaining the chemical purity of each line. DPN, described in the January 29, 1999, issue of Science, turns an atomic force microscope (AFM) into a writing instrument. Researchers first apply an oily "ink" of octadecanethiol (ODT) to the AFM's tip. When the tip is brought into contact with a thin sheet of gold "paper," the ODT molecules are transferred to the gold's surface via a tiny water droplet that forms naturally at the tip. The new nanoplotter multiplies this technique, laying down a series of molecular lines with extreme accuracy. While the microfabrication of electronic circuits and other products currently uses solid-state or inorganic materials, innovations such as the nanoplotter will direct future technologies toward the use of organic and even biological materials.

Coupling the organic and inorganic, biological engineers at Cornell University (Ithaca, New York) demonstrated the feasibility of small, self-propelled bionic motors that do their builders' bidding in plant, animal, or human cells. Such machines could travel through the body, functioning as mobile pharmacies dispensing precise doses of chemotherapy drugs exclusively to cancer cells, for example. The device, the result of integrating a living molecular motor with a fabricated device at the "nano" scale, is a few billionths of a meter in size. The first integrated motor, a molecule of the enzyme ATPase coupled to a metallic substrate with a genetically engineered "handle," ran for 40 minutes at 3–4 revolutions per second.

Researchers at Rensselaer Polytechnic Institute (Troy, New York) are working with materials comprising common atoms arranged in grains less than 100 nm in diameter—10,000 times smaller than grains in conventional materials. These grains are used as building blocks to create materials with entirely new properties, which the researchers predict could revolutionize everything from drug delivery to sunscreens.

Although it is still in an early stage of development, not unlike that of computer and information technology in the 1950s, nanotechnology is a rapidly expanding scientific field.

Defense programs in many countries are now concentrating on nanotechnology research programs that will facilitate advances in programs such as those designed to create secure but small messaging equipment, allow the development of smart weapons, improve stealth capabilities, develop specialized sensors (including bio-inclusive sensors), create self-repairing military equipment, and improve the development and delivery mechanisms for medicines and vaccines.

See also Microtechnology.

Resources

books

Mulhall, Douglas. Our Molecular Future: How Nanotechnology, Robotics, Genetics, and Artificial Intelligence Will Transform Our World. Amherst, NY: Prometheus Books, 2002.

Ratner, Mark A., and Daniel Ratner. Nanotechnology: A Gentle Introduction to the Next Big Idea. Upper Saddle River, NJ: Prentice Hall Publishers, 2002.


periodicals

Bennewitz, R., et al. "Atomic Scale Memory at a Silicon Surface." Nanotechnology 13 (2000): 499–502.


organizations

National Science and Technology Council. "National Nanotechnology Initiative" [cited March 10, 2003] <http://www.nano.gov/start.htm>.


Iain A. McIntyre

KEY TERMS

Ballistic transport—Movement of a carrier through a semiconductor without collisions, resulting in extraordinary electrical properties.

Carriers—Charge-carrying particles in semiconductors: electrons and holes.

Epitaxy—The growth of crystalline layers of semiconducting materials in a layered structure.

Integrated circuits—Complex electronic circuits fabricated using multiple growth and lithography/pattern transfer stages to produce many miniature electronic elements on a monolithic device.

Nanotechnology

views updated May 29 2018

Nanotechnology

K. LEE LERNER

Defense programs in many countries are now concentrating on nanotechnology research that will facilitate advances in the technology used to create secure but small messaging equipment, allow the development of smart weapons, improve stealth capabilities, aid in developing specialized sensors (including bio-inclusive sensors), help to create self-repairing military equipment, and improve the development and delivery mechanisms for medicines and vaccines.

Nanotechnology builds on advances in microelectronics during the last decades of the twentieth century. The miniaturization of electrical components greatly increased the utility and portability of computers, imaging equipment, microphones, and other electronics. Indeed, the production and wide use of such commonplace devices such as personal computers and cell phones was absolutely dependent on advances in microtechnology.

Despite these fundamental advances there remain real physical constraints (e.g., microchip design limitations) to further miniaturization based upon conventional engineering principles. Nanotechnologies intend to revolutionize components and manufacturing techniques to overcome these fundamental limitations. In addition, there are classes of biosensors and feedback control devices that require nanotechnology because, despite advances in microtechnology, present components remain too large or slow.

Advances in Nanotechnology

Nanotechnology advances affect all branches of engineering and science that deal directly with device components ranging in size from about one ten-millionth of a millimeter (0.1 nanometer) to one ten-thousandth of a millimeter (100 nanometers). At these scales, even the most sophisticated microtechnology-based instrumentation is useless. Engineers anticipate that advances in nanotechnology will allow the direct manipulation of molecules in biological samples (e.g., proteins or nucleic acids), paving the way for the development of new materials that have a biological component or that can provide a biological interface.

In addition to new tools, nanotechnology programs advance practical understanding of quantum physics. The internalization of quantum concepts is a necessary component of nanotechnology research programs because the laws of classical physics (e.g., classical mechanics or generalized gas laws) do not always apply at the atomic and near-atomic level.

Nanotechnology and quantum physics. Quantum theory and mechanics describe the relationship between energy and matter on the atomic and subatomic scale. At the beginning of the twentieth century, German physicist Max Planck (1858–1947) proposed that atoms absorb or emit electromagnetic radiation in bundles of energy termed quanta. This quantum concept seemed counterintuitive to well-established Newtonian physics. Advancements associated with quantum mechanics (e.g., the uncertainty principle) also had profound implications for the philosophical and scientific arguments regarding the limitations of human knowledge.

Planck's quantum theory, which also asserted that the energy of light (a photon) was directly proportional to its frequency, proved a powerful concept that accounted for a wide range of physical phenomena. Planck's constant relates the energy of a photon with the frequency of light. Along with the constant for the speed of light, Planck's constant (h = 6.626 × 10⁻³⁴ joule-seconds) is a fundamental constant of nature.
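As a minimal numerical sketch (not from the source), the Planck relation E = hf can be applied to a single photon of visible light; the constants match the values quoted in the text, while the chosen wavelength of roughly 555 nm (green light) is simply an illustrative example.

    H = 6.626e-34   # Planck's constant, joule-seconds (value quoted in the text)
    C = 2.998e8     # speed of light, meters per second

    def photon_energy_joules(frequency_hz: float) -> float:
        """Energy carried by a single photon, E = h * f."""
        return H * frequency_hz

    green_light_hz = C / 555e-9                              # about 5.4e14 Hz
    print(f"{photon_energy_joules(green_light_hz):.2e} J")   # roughly 3.6e-19 J per photon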

Prior to Planck's work, electromagnetic radiation (light) was thought to travel in waves with an infinite number of available frequencies and wavelengths. Planck's work focused on attempting to explain the limited spectrum of light emitted by hot objects. Danish physicist Niels Bohr (1885–1962) studied Planck's quantum theory of radiation and worked in England with physicists J. J. Thomson (1856–1940) and Ernest Rutherford (1871–1937) to improve their classical models of the atom by incorporating quantum theory. During this time, Bohr developed his model of atomic structure. According to the Bohr model, when an electron is excited by energy it jumps from its ground state to an excited state (i.e., a higher-energy orbital). The excited atom can then emit energy only in certain (quantized) amounts as its electrons jump back to lower-energy orbits located closer to the nucleus. This excess energy is emitted in quanta of electromagnetic radiation (photons of light) that have exactly the same energy as the difference in energy between the orbits jumped by the electron.

The electron quantum leaps between orbits proposed by the Bohr model accounted for Planck's observations that atoms emit or absorb electromagnetic radiation in quanta. Bohr's model also explained many important properties of the photoelectric effect described by Albert Einstein (1879–1955). Einstein assumed that light was transmitted as a stream of particles termed photons. By extending the well-known wave properties of light to include a treatment of light as a stream of photons, Einstein was able to explain the photoelectric effect. Photoelectric properties are key to the regulation of many microtechnology systems and of proposed nanotechnology-level systems.

Quantum mechanics ultimately replaced electron "orbitals" of earlier atomic models with allowable values for angular momentum (angular velocity multiplied by mass) and depicted electron positions in terms of probability "clouds" and regions.

In the 1920s, the concept of quantization and its application to physical phenomena was further advanced by more mathematically complex models based on the work of the French physicist Louis Victor de Broglie (1892–1987) and Austrian physicist Erwin Schrödinger (1887–1961) that depicted the particle and wave nature of electrons. De Broglie showed that the electron was not merely a particle but a waveform. This proposal led Schrödinger to publish his wave equation in 1926. Schrödinger's work described electrons as a "standing wave" surrounding the nucleus, and his system of quantum mechanics is called wave mechanics. German physicist Max Born (1882–1970) and English physicist P. A. M. Dirac (1902–1984) made further advances in defining the subatomic particles (principally the electron) as a wave rather than as a particle and in reconciling portions of quantum theory with relativity theory.

Working at about the same time, German physicist Werner Heisenberg (1901–1976) formulated the first complete and self-consistent theory of quantum mechanics. Matrix mathematics was well established by the 1920s, and Heisenberg applied this powerful tool to quantum mechanics. In 1927, Heisenberg put forward his uncertainty principle, which states that two complementary properties of a system, such as position and momentum, can never both be known exactly. This proposition helped cement the dual nature of particles (e.g., light can be described as having both wave and particle characteristics). Electromagnetic radiation (of which visible light is one region of the spectrum) is now understood to have both particle and wave properties.

In 1925, Austrian-born physicist Wolfgang Pauli (1900–1958) published the Pauli exclusion principle, which states that no two electrons in an atom can simultaneously occupy the same quantum state (i.e., energy state). Pauli's specification of spin (+1/2 or −1/2) for an electron gave the two electrons in any suborbital differing quantum numbers (a system used to describe the quantum state) and made completely understandable the structure of the periodic table in terms of electron configurations (i.e., the energy-related arrangement of electrons in energy shells and suborbitals).
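The bookkeeping behind that last claim can be made concrete. The sketch below (not from the source) counts the allowed combinations of orbital, magnetic, and spin quantum numbers for each principal quantum number n; since the exclusion principle allows at most one electron per combination, the count reproduces the familiar shell capacities that underlie the periodic table.

    def shell_capacity(n: int) -> int:
        """Count the distinct (l, m, spin) states available in the shell with principal number n."""
        count = 0
        for l in range(n):                 # orbital quantum number l = 0 .. n-1
            for m in range(-l, l + 1):     # magnetic quantum number m = -l .. +l
                count += 2                 # two spin states: +1/2 and -1/2
        return count

    print([shell_capacity(n) for n in (1, 2, 3, 4)])   # [2, 8, 18, 32]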

In 1931, American chemist Linus Pauling published a paper that used quantum mechanics to explain how two electrons, from two different atoms, are shared to make a covalent bond between the two atoms. Pauling's work provided the connection needed in order to fully apply the new quantum theory to chemical reactions.

Advances in nanotechnology depend upon an understanding and application of these fundamental quantum principles. At the quantum level the smoothness of classical physics disappears and nanotechnologies are predicated on exploiting this quantum roughness.

Applications

The development of devices that are small, light, and self-contained, that use little energy, and that will replace larger microelectronic equipment is one of the first goals of the anticipated nanotechnology revolution. The second phase will be marked by the introduction of materials not feasible at larger-than-nanotechnology scales. Given the nature of quantum variance, scientists theorize that single-molecule sensors can be developed and that sophisticated memory storage and neural-like networks can be achieved with a very small number of molecules.

Traditional engineering concepts undergo radical transformation at the atomic level. For example, nanotechnology motors may drive gears whose cogs are composed of the atoms attached to a carbon ring. Nanomotors may themselves be driven by oscillating magnetic fields or high-precision oscillating lasers.

Perhaps the greatest promise for nanotechnology lies in potential biotechnology advances. Potential nanoscale manipulation of DNA offers the opportunity to radically expand the horizons of genomic medicine and immunology. Tissue-based biosensors may unobtrusively be able to monitor and regulate site-specific medicine delivery or regulate physiological processes. Nanosystems might serve as highly sensitive detectors of toxic substances or be used by inspectors to detect traces of biological or chemical weapons.

In electronics and computer science, scientists assert that nanotechnologies will be the next major advance in computing and information-processing science. Microelectronic devices rely on recognition and flips in electron gating (e.g., where differential states are ultimately represented by a series of binary numbers ["0" or "1"] that depict voltage states). In contrast, future quantum processing will utilize the identity of quantum states as set forth by quantum numbers. In quantum cryptography, systems with the ability to decipher encrypted information will rely on precise knowledge of the manipulations used to achieve various atomic states.

Nanoscale devices are constructed using a combination of fabrication steps. In the initial growth stage, layers of semiconductor materials are grown on a dimension-limiting substrate. Layer composition can be altered to control electrical and/or optical characteristics. Techniques such as molecular beam epitaxy (MBE) and metallo-organic chemical vapor deposition (MOCVD) are capable of producing layers a few atoms thick. The developed pattern is then imposed on successive layers (the pattern transfer stage) to develop the desired three-dimensional structural characteristics.

Nanotechnology Research

In the United States, expenditures on nanotechnology development top $500 million per year and are largely coordinated by the National Science Foundation and the Department of Defense Advanced Research Projects Agency (DARPA) under the umbrella of the National Nanotechnology Initiative. Other institutions with dedicated funding for nanotechnology include the Department of Energy (DOE) and the National Institutes of Health (NIH).

Research interests. Current research interests in nanotechnology include programs to develop and exploit nanotubes for their ability to provide extremely strong bonds. Nanotubes can be flexed and woven into fibers for use in ultrastrong, but also ultralight, bulletproof vests. Nanotubes are also excellent conductors that can be used to develop precise electronic circuitry.

Other interests include the development of nanotechnology-based sensors that allow smarter autonomous weapons capable of a greater range of adaptations en route to a target; materials that offer stealth characteristics across a broader span of the electromagnetic spectrum; self-repairing structures; and nanotechnology-based weapons designed to disrupt, but not destroy, electrical system infrastructure.

FURTHER READING:

BOOKS:

Mulhall, Douglas. Our Molecular Future: How Nanotechnology, Robotics, Genetics, and Artificial Intelligence Will Transform Our World. Amherst, NY: Prometheus Books, 2002.

PERIODICALS:

Bennewitz, R., et al. "Atomic Scale Memory at a Silicon Surface." Nanotechnology 13 (2000): 499–502.

ELECTRONIC:

National Science and Technology Council. "National Nanotechnology Initiative." <http://www.nano.gov/start.htm> (March 19, 2003).

SEE ALSO

DARPA (Defense Advanced Research Projects Agency)

Nanotechnology

views updated Jun 08 2018

Nanotechnology

Like a swarm of bees, tiny human-made satellites, called nanosatellites or picosatellites depending on their size, may one day fly in formation to remote destinations throughout the solar system. Upon reaching their targets, they will spread out to investigate the area, perhaps one satellite landing on each of a thousand asteroids, crawling around its surface, and sending data back to scientists waiting on Earth. Another swarm might cover the surface of Mars with an army of explorers, investigating more area in one day than a standard rover could reach in several years. Alternatively, the group might be designed to stay together to accomplish its mission: a cluster of satellites, each carrying a tiny mirror, could be coordinated to act as one giant telescope mirror, surpassing the Hubble Space Telescope's light-gathering power by a factor of a thousand.

Problems with Large Satellites

Typical satellites deployed in the early twenty-first century weigh more than 1,000 kilograms (2,200 pounds). To qualify as a nanosatellite, a device must weigh less than 20 kilograms (44 pounds); a picosatellite, less than 1 kilogram (2.2 pounds). Such small nano- or picosatellites could address two of the major problems involved with traditional satellite technology:

  1. Cost. The major expense of deploying a traditional satellite lies in transportation costs. A ride on the shuttle averages $6,000 per pound, so the lighter the better (a rough cost comparison is sketched after this list). Tiny satellites could possibly be launched using small rockets or electromagnetic railguns, bypassing the expensive shuttle ride altogether.
  2. Failure due to one faulty system. If the communications system of a traditional satellite fails, or if the satellite is damaged during deployment, the whole mission might be scrapped, at a loss of millions of dollars. But nano- and picosatellites could be designed with distributed functions in mind: some may be responsible for navigation, some for communication, and some for taking photographs of target sites. Should a problem develop in one of the units, others in the group with the same function would take over. Distributed functions and built-in redundancy would save the mission.
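To illustrate the cost point above, the short sketch below (not from the source) multiplies the quoted $6,000-per-pound shuttle figure by the satellite masses given in this article; the comparison is purely illustrative and ignores every other mission cost.

    COST_PER_POUND_USD = 6_000     # shuttle launch cost per pound quoted above
    LBS_PER_KG = 2.2046

    def launch_cost_usd(mass_kg: float) -> float:
        """Approximate launch cost for a payload of the given mass."""
        return mass_kg * LBS_PER_KG * COST_PER_POUND_USD

    print(f"Traditional satellite (1,000 kg): ${launch_cost_usd(1000):,.0f}")   # about $13 million
    print(f"Nanosatellite (20 kg): ${launch_cost_usd(20):,.0f}")                # about $265,000
    print(f"Picosatellite (1 kg): ${launch_cost_usd(1):,.0f}")                  # about $13,000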

Early Attempts: OPAL

Thanks to the miniaturization of off-the-shelf computer components, satellites the size of a deck of cards have already orbited Earth, performing simple tasks, and sending signals back to interested parties on Earth. These include groups of college students at Stanford University in California, who designed and built a satellite "mothership" called OPAL (Orbiting Picosatellite Automatic Launcher) as part of their master's degree program; a group called Artemis at Santa Clara University in California, who designed three of the picosatellite "daughterships" for the mission; and a group of ham radio operators from Washington, D.C., whose StenSat picosatellite was also included aboard the mothership. The Aerospace Corporation in El Segundo, California, manufactured the final two picosatellites for the mission to test microelectromechanical systems (MEMS) technology.

OPAL was launched onboard a JAWsat launch vehicle on January 26, 2000, from the Vandenberg Air Force Base in California. It consisted of a hexagonal, aluminum mothership 23 centimeters (9 inches) tall, weighing 23 kilograms (51 pounds), and containing the six small daughter satellites described earlier, weighing about 0.45 kilograms (1 pound) each. When it reached its orbiting altitude of 698 kilometers (434 miles) above Earth, the picosatellite daughterships were deployed by a spring-launching device.

Once free of the mothership, the picosatellites went into operating mode. One of the three Artemis satellites began transmitting the group's web site address in Morse code, while the other two measured the field strength of lightning strikes. StenSat's transponder sent telemetry signals to ham radio operators around the world. The two satellites from the Aerospace Corporation were tethered together and communicated with each other and with engineers on Earth using MEMS switches that selected among various experimental radio frequencies for transmission. OPAL was still operating a year after launch.

Micro-and Nanotechnologies

The technology that made OPAL possible is as near as one's laptop computer or personal digital assistant. Computing power that used to require a mainframe computer in a room of its own can now fit into a laptop, thanks to innovative engineers who continually cram more and more memory onto smaller and smaller silicon chips. The student engineers used a Motorola microcontroller with 1 MB of onboard RAM operating at 8.38 MHz as OPAL's central processing unit. It was powered by commercially available solar panels and backed up by rechargeable nickel-cadmium batteries.

But off-the-shelf components, while sufficient for student projects, will not survive at the cutting edge of nanosatellite technology; other technologies will be necessary to keep the smaller-and-smaller trend going. MEMS are tiny devices (gears, switches, valves, sensors, or other standard mechanical or electrical parts) made out of silicon. The technology arose out of the techniques used by microchip designers: pattern a wafer of pure silicon with the dimensions of the transistors, resistors, logic gates, and connectors required for the chip, etch away the material surrounding the pattern, and one has the beginnings of an electronic circuit. So why not do the same for mechanical systems? Lay out a pattern for a tiny gear on a silicon wafer, etch away the surrounding material, including the material underneath that holds the gear to the wafer and to its axle, and one has a working gear that can mesh with other gears. By making sandwiches of different materials and etching them in a carefully controlled manner, scientists have been able to make gears, valves, pumps, switches, and sensors on a very small scale, the microscale. MEMS technology is often called a "top-down" approach: start with a large wafer of silicon and make microcomponents out of it.

To reach the even smaller nanoscale requires a "bottom-up" approach. Using instruments such as an atomic force microscope that can manipulate individual atoms, engineers can build tiny devices an atom at a time. Or, by understanding how atoms tend to bond together naturally, scientists can create conditions where nanoscale devices "self-assemble" on a patterned surface out of the atoms in a vapor. Such precise control will enable them to build nanostructures 1,000 times smaller than MEMS devices. This level of structural control will be necessary for the next generation of sophisticated nano-and picosatellites currently in the planning stages.

What Is Next?

The National Aeronautics and Space Administration's (NASA) Space Technology 5 (ST5) mission is scheduled to launch three nanosatellites into low orbit in 2003. The ST5 nanosatellites will be small octagons about 43 centimeters (17 inches) in diameter and 20 centimeters (8 inches) high, about the size of a big birthday cake. They will be complete systems in themselves, each having navigation, guidance, propulsion, and communications abilities. In addition, the ST5 nanosatellites will be test platforms for new space technologies. One of these, called A Formation Flying and Communications Instrument, is a communications system designed to monitor the positions of small spacecraft relative to each other and to the ground, a first attempt at making satellites fly in formation. Other technologies to be tested on ST5 include a lithium-ion power system that can store two to four times more energy than current batteries, an external coating that can be tuned to absorb heat when the spacecraft is cold or to emit heat when it is too warm, and a MEMS chip that makes fine attitude adjustments to the spacecraft using 8.5 times less power than 2002 devices.

By 2020 NASA hopes to deploy ANTS to the asteroid belt between Mars and Jupiter. ANTS stands for Autonomous Nano Technology Swarm. Each tiny spacecraft would weigh about 1 kilogram (2.2 pounds) and have its own solar sail to power its flight. After a three-year trip, the swarm would spread out to cover thousands of asteroids. The swarm would have a hierarchy of rulers, messengers, and workers. Each satellite would carry one type of instrumentation to perform a specific function: measure a magnetic field, detect gamma rays, take photographs, or analyze the surface composition of an asteroid. Messengers would relay instructions from the rulers to the workers, and also inform the rulers of important information collected by the workers. The rulers could then decide to reassign some of the workers to explore the more promising areas. In the end, a small number of messengers would return to the space station to deliver the data to scientists; the rest of the swarm would perish in space, having finished their duties. Scientists hope to obtain valuable information about the mineral resources of the asteroid belt, which could be a source for metals and other raw materials needed to build colonies in space.

Future Prospects

Nano- and picosatellites will also be useful in Earth orbit in situations where information from a large area is needed simultaneously. Traditional satellites can only be in one place at a time, but picosatellites can be everywhere, if enough of them are deployed. A swarm of picosatellites equipped with cameras and communications links could gather vital information from a battlefield on Earth, relaying enemy positions and troop counts to generals behind the lines. Or an array of satellites could be launched to gather atmospheric information that could help to predict the formation of hurricanes and tornadoes in time to warn the population. The Earth's entire magnetic field might be captured in one instantaneous "snapshot" by widely scattered swarms of satellites.

Projecting far into the future, perhaps a picosatellite could be made that would travel as far as possible into space, then manufacture a copy of itself before its mechanisms failed. This second generation robot/satellite could then travel as far as it could before making another replica, and so on. By sending out millions of tiny, affordable, self-replicating satellites, humankind's reach might one day extend to the farthest parts of the solar system.

see also Miniaturization (volume 4); Robotic Exploration of Space (volume 2); Robotics Technology (volume 2); Satellites, Types of (volume 1).

Tim Palucka

Bibliography

Booth, Nicholas. Space: The Next 100 Years. New York: Orion Books, 1990.

The Editors of Time-Life Books. Spacefarers. Alexandria, VA: Time-Life Books, 1990.

Internet Resources

Orbiting Picosatellite Automated Launcher. Stanford University. <http://ssdl.stanford.edu/opal/>.

Space Technology 5 (ST5). New Millennium Program. <http://nmp.jpl.nasa.gov/st5>.

nanotechnology

views updated Jun 11 2018

nanotechnology Micromechanics used to develop working devices only a few nanometres in size. US scientists have etched an electric motor from silicon that is less than 0.1 mm (0.0039 in) wide. They have also made workable gears with a diameter less than that of a human hair. Further advances have resulted in the manipulation of electrons and individual atoms.

Nanotechnology

views updated May 21 2018

Nanotechnology

See Biotechnology