
Supercomputers

Supercomputers, the world's largest and fastest computers, are primarily used for complex scientific calculations. The parts of a supercomputer are comparable to those of a desktop computer: they both contain hard drives, memory, and processors (circuits that process instructions within a computer program).

Although both desktop computers and supercomputers are equipped with similar processors, their speeds and memory sizes are significantly different. For instance, a desktop computer built in the year 2000 typically has a hard disk capacity of between 2 and 20 gigabytes and one processor with tens of megabytes of random access memory (RAM), just enough to perform tasks such as word processing, web browsing, and video gaming. A supercomputer of the same period, by contrast, has thousands of processors, hundreds of gigabytes of RAM, and hard drives that allow for hundreds, and sometimes thousands, of gigabytes of storage space.

The supercomputer's large number of processors, enormous disk storage, and substantial memory greatly increase the power and speed of the machine. Although desktop computers can perform millions of floating-point operations per second (megaflops), supercomputers can perform at speeds of billions of operations per second (gigaflops) and trillions of operations per second (teraflops).
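The megaflop scale described above can be made concrete with a rough measurement. The sketch below is illustrative Python (a language the article does not mention), and `estimate_flops` is an invented helper name; a pure-Python loop is dominated by interpreter overhead, so the number it reports is far below any machine's true hardware peak.

```python
import time

def estimate_flops(n=1_000_000):
    """Crudely estimate floating-point additions per second.

    Interpreter overhead dominates, so this understates the hardware's
    real capability; it only illustrates what the flops metric counts.
    """
    x = 0.0
    start = time.perf_counter()
    for _ in range(n):
        x += 1.0  # one floating-point operation per iteration
    elapsed = time.perf_counter() - start
    return n / elapsed  # operations per second

rate = estimate_flops()
print(f"~{rate / 1e6:.1f} megaflops (pure-Python estimate)")
```

A machine rated in teraflops performs a million times more such operations per second than one rated in megaflops.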

Evolution of Supercomputers

Many current desktop computers are actually faster than the first supercomputer, the Cray-1, which was developed by Cray Research in the mid-1970s. The Cray-1 was capable of computing at 167 megaflops using a form of supercomputing called vector processing, which consists of the rapid execution of instructions in a pipelined fashion. Contemporary vector processing supercomputers are much faster than the Cray-1, but an even faster method of supercomputing was introduced in the mid-1980s: parallel processing. Applications that use parallel processing solve computational problems by using multiple processors simultaneously.

Using the following scenario as a comparative example, it is easy to see why parallel processing is becoming the preferred supercomputing method. If you were preparing ice cream sundaes for yourself and nine friends, you would need ten bowls, ten scoops of ice cream, ten drizzles of chocolate syrup, and ten cherries. Working alone, you would take ten bowls from the cupboard and line them up on the counter. Then, you would place one scoop of ice cream in each bowl, drizzle syrup on each scoop, and place a cherry on top of each dessert. This method of preparing sundaes would be comparable to vector processing. To get the job done more quickly, you could have some friends help you in a parallel processing method. If two people prepared the sundaes, the process would be twice as fast; with five it would be five times as fast; and so on.
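The sundae scenario corresponds to a parallel "map" operation: the same task applied independently to each item. The sketch below is an illustrative Python analogy, not anything from the article; `prepare_sundae` and the guest list are invented names, and a thread pool on one machine merely stands in for the many processors of a parallel supercomputer (for pure-Python work like this, threads give no real speedup, so the point is the structure, not the timing).

```python
from concurrent.futures import ThreadPoolExecutor

def prepare_sundae(guest):
    # One "worker" performs the full scoop-drizzle-cherry sequence
    # for a single bowl, independently of the other bowls.
    return f"sundae for {guest}: scoop + syrup + cherry"

guests = [f"guest {i}" for i in range(10)]

# Working alone: one worker handles all ten bowls in order.
serial = [prepare_sundae(g) for g in guests]

# Working in parallel: five workers each take a share of the bowls.
with ThreadPoolExecutor(max_workers=5) as pool:
    parallel = list(pool.map(prepare_sundae, guests))

assert serial == parallel  # same result either way
```

The results are identical; only the wall-clock time differs when the per-item work is substantial and the workers are genuinely independent.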

Conversely, if five people will not fit in your small kitchen, it would be easier to use vector processing and prepare all ten sundaes yourself. The same trade-off holds in supercomputing. Some researchers prefer vector computing because their calculations cannot be readily distributed among the many processors of a parallel supercomputer. But if a researcher needs a supercomputer that performs trillions of operations per second, parallel processors are preferred, even though programming for a parallel supercomputer is usually more complex.

Applications of Supercomputers

Supercomputers are so powerful that they can provide researchers with insight into phenomena that are too small, too big, too fast, or too slow to observe in laboratories. For example, astrophysicists use supercomputers as "time machines" to explore the past and the future of our universe. A supercomputer simulation was created in 2000 that depicted the collision of two galaxies: our own Milky Way and Andromeda. Although this collision is not expected to happen for another three billion years, the simulation allowed scientists to run the experiment and see the results now. This particular simulation was performed on Blue Horizon, a parallel supercomputer at the San Diego Supercomputer Center. Using 256 of Blue Horizon's 1,152 processors, the simulation demonstrated what will happen to millions of stars when these two galaxies collide. This would have been impossible to do in a laboratory.

Another example of supercomputers at work is molecular dynamics (the way molecules interact with each other). Supercomputer simulations allow scientists to dock two molecules together to study their interaction. Researchers can determine the shape of a molecule's surface and generate an atom-by-atom picture of the molecular geometry. Molecular characterization at this level is extremely difficult, if not impossible, to perform in a laboratory environment. However, supercomputers allow scientists to simulate such behavior easily.

Supercomputers of the Future

Research centers are constantly delving into new applications like data mining to explore additional uses of supercomputing. Data mining is a class of applications that look for hidden patterns in a group of data, allowing scientists to discover previously unknown relationships among the data. For instance, the Protein Data Bank at the San Diego Supercomputer Center is a collection of scientific data that provides scientists around the world with a greater understanding of biological systems. Over the years, the Protein Data Bank has developed into a web-based international repository for three-dimensional molecular structure data that contains detailed information on the atomic structure of complex molecules. The three-dimensional structures of proteins and other molecules contained in the Protein Data Bank and supercomputer analyses of the data provide researchers with new insights on the causes, effects, and treatment of many diseases.

Other modern supercomputing applications involve the advancement of brain research. Researchers are beginning to use supercomputers to provide them with a better understanding of the relationship between the structure and function of the brain, and how the brain itself works. Specifically, neuroscientists use supercomputers to look at the dynamic and physiological structures of the brain. Scientists are also working toward development of three-dimensional simulation programs that will allow them to conduct research on areas such as memory processing and cognitive recognition.

In addition to new applications, the future of supercomputing includes the assembly of the next generation of computational research infrastructure and the introduction of new supercomputing architectures. Parallel supercomputers have many processors, distributed and shared memory, and many communications parts; we have yet to explore all of the ways in which they can be assembled. Supercomputing applications and capabilities will continue to develop as institutions around the world share their discoveries and researchers become more proficient at parallel processing.

see also Animation; Parallel Processing; Simulation.

Sid Karin and Kimberly Mann Bruch

Bibliography

Jortberg, Charles A. The Supercomputers. Minneapolis, MN: Abdo and Daughters Pub., 1997.

Karin, Sid, and Norris Parker Smith. The Supercomputer Era. Orlando, FL: Harcourt Brace Jovanovich, 1987.

Internet Resources

Dongarra, Jack, Hans Meuer, and Erich Strohmaier. Top 500 Supercomputer Sites. University of Mannheim (Germany) and University of Tennessee. <http://www.top500.org/>

San Diego Supercomputer Center. SDSC Science Discovery. <http://www.sdsc.edu/discovery/>

"Supercomputers." Computer Sciences. Encyclopedia.com. (April 23, 2017). http://www.encyclopedia.com/computing/news-wires-white-papers-and-books/supercomputers

Supercomputers

BRIAN HOYLE

A supercomputer is a powerful computer that possesses the capacity to store and process far more information than is possible using a conventional personal computer.

An illustrative comparison can be made between the hard drive capacity of a personal computer and that of a supercomputer. Hard drive capacity is measured in gigabytes. A gigabyte is one billion bytes, and a byte is a unit of data eight binary digits (0s and 1s) long, enough to represent a number, a letter, or a typographic symbol. A premium personal computer has a hard drive capable of storing on the order of 30 gigabytes of information. In contrast, a supercomputer has a capacity of 200 to 300 gigabytes or more.
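The unit arithmetic above can be checked in a few lines of Python. This is an illustrative sketch only; the 30-gigabyte and 300-gigabyte figures are the ones quoted in the passage, and the variable names are invented.

```python
BITS_PER_BYTE = 8
GIGABYTE = 10**9  # one billion bytes, as defined above

pc_drive_bytes = 30 * GIGABYTE     # premium personal computer of the era
super_drive_bytes = 300 * GIGABYTE  # high-end supercomputer of the era

# The supercomputer's drive holds ten times as much data.
print(super_drive_bytes // pc_drive_bytes)  # → 10
```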

Another useful comparison between supercomputers and personal computers is in the number of processors in each machine. A processor is the circuitry responsible for handling the instructions that drive a computer. Personal computers have a single processor. The largest supercomputers have thousands of processors.

This enormous computational power makes supercomputers capable of handling large amounts of data and processing information extremely quickly. For example, in April 2002 a Japanese supercomputer containing 5,104 processors established a calculation speed record of 35,600 gigaflops (a gigaflop is one billion mathematical calculations per second). This exceeded the record held by the ASCI White-Pacific supercomputer at the Lawrence Livermore National Laboratory in Livermore, California, which is equipped with over 7,000 processors and achieves 7,226 gigaflops.

These speeds are a far cry from those of the first successful supercomputer, the CDC 6600, which was designed by Seymour Cray (who later founded Cray Research) at the Control Data Corporation in 1964. That computer had a speed of about 9 megaflops, thousands of times slower than present-day versions. Still, at the time, the CDC 6600 was an impressive advance in computer technology.

Beginning around 1995, another approach to designing supercomputers appeared. In grid computing, thousands of individual computers are networked together, even via the Internet. The combined computational power can exceed that of the all-in-one supercomputer at far less cost. In the grid approach, a problem can be broken down into components, and the components can be parceled out to the various computers. As the component problems are solved, the solutions are pieced back together mathematically to generate the overall solution.
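The split, solve, and recombine pattern described above can be sketched in a few lines of Python. This is an illustrative toy under stated assumptions: `split` and `solve` are invented names, the "problem" is just a large sum, and a thread pool on one machine stands in for thousands of networked computers in a real grid.

```python
from concurrent.futures import ThreadPoolExecutor

def split(numbers, n_workers):
    """Break one large problem into independent component problems."""
    chunk = len(numbers) // n_workers
    parts = [numbers[i * chunk:(i + 1) * chunk] for i in range(n_workers - 1)]
    parts.append(numbers[(n_workers - 1) * chunk:])  # remainder goes to the last worker
    return parts

def solve(component):
    """Each 'grid node' solves its component independently."""
    return sum(component)

numbers = list(range(1_000))

# Parcel the components out to the workers, then piece the partial
# solutions back together into the overall solution.
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(solve, split(numbers, 4)))

overall = sum(partials)
assert overall == sum(numbers)
```

The essential property is that the components are independent, so they can be solved in any order, on any machine, and still recombine into the correct answer.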

The phenomenally fast calculation speeds of present-day supercomputers essentially correspond to "real time," meaning an event can be monitored or analyzed as it occurs. For example, a detailed weather map that would take a personal computer several days to compile can be compiled on a supercomputer in just a few minutes.

Supercomputers like the Japanese version are built to model events such as climate change, global warming, and earthquake patterns. Increasingly, however, supercomputers are being used for security purposes such as the analysis of electronic transmissions (i.e., email, faxes, telephone calls) for codes. For example, a network of supercomputers and satellites that is called Echelon is used to monitor electronic communications in the United States, Canada, United Kingdom, Australia, and New Zealand. The stated purpose of Echelon is to combat terrorism and organized crime activities.

The next generation of supercomputers is under development, and three particularly promising technologies are being explored. The first is optical computing, in which light is used instead of electrons to carry information. Light moves much faster than an electron can, so the speed of transmission is greater.

The second technology is known as DNA computing. Here, calculations are performed by recombining strands of DNA in different sequences; the sequences that are favored and persist represent the optimal solution. In principle, solutions can be deduced even before a problem has actually been posed.

The third technology is called quantum computing. Properties of atoms or nuclei, designated as quantum bits, or qubits, would serve as the computer's processor and memory. A quantum computer would be capable of working on many aspects of a problem at the same time, on many different numbers at once, then using these partial results to arrive at a single answer. For example, deciphering the correct code from a 400-digit number would take a conventional supercomputer millions of years; a quantum computer about the size of a teacup could do the job in about a year.

FURTHER READING:

BOOKS:

Stork, David G., ed. HAL's Legacy: 2001's Computer as Dream and Reality. Foreword by Arthur C. Clarke. Cambridge, MA: MIT Press, 1997.

ELECTRONIC:

Cray Corporation. "What Is a Supercomputer?" Supercomputing. 2002. <http://www.cray.com/supercomputing> (15 December 2002).

The History of Computing Foundation. "Introduction to Supercomputers." Supercomputers. October 13, 2002. <http://www.thocp.net/hardware/supercomputers.htm> (15 December 2002).

SEE ALSO

Computer Hardware Security
Information Warfare

"Supercomputers." Encyclopedia of Espionage, Intelligence, and Security. Encyclopedia.com. (April 23, 2017). http://www.encyclopedia.com/politics/encyclopedias-almanacs-transcripts-and-maps/supercomputers

supercomputer

supercomputer, a state-of-the-art, extremely powerful computer capable of manipulating massive amounts of data in a relatively short time. Supercomputers are very expensive and are employed for specialized scientific and engineering applications that must handle very large databases or do a great amount of computation, among them meteorology, animated graphics, fluid dynamic calculations, nuclear energy research and weapon simulation, and petroleum exploration. There are two approaches to the design of supercomputers. One, called massively parallel processing (MPP), is to chain together thousands of commercially available microprocessors utilizing parallel processing techniques. A variant of this, called a Beowulf cluster, or cluster computing, employs large numbers of personal computers interconnected by a local area network and running programs written for parallel processing. The other approach, called vector processing, is to develop specialized hardware to solve complex calculations. This technique was employed (2002) in the Earth Simulator, a Japanese supercomputer with 640 nodes composed of 5,104 specialized processors to execute 35.6 trillion mathematical operations per second; it is used to analyze earthquake and weather patterns and climate change, including global warming. Operating systems for supercomputers, formerly largely Unix-based, are now typically Linux-based.

Advances in supercomputing have regularly resulted in new supercomputers that significantly exceed the capabilities of those that are only a year older; by 2012 the fastest supercomputer was more than 250,000 times faster than the fastest in 1993 in terms of the number of calculations per second it could complete. Although calculation speed is the standard for measuring supercomputer power, it is not, however, an accurate indicator of everyday performance; most supercomputers are not fully utilized when running programs. Supercomputers can require significant amounts of electrical power, and many use water and refrigeration for cooling, but some are air-cooled and use no more power than the average home. In 2003 scientists at Virginia Tech assembled a relatively low-cost supercomputer using 1,100 dual-processor Apple Macintoshes; it was ranked at the time as the third fastest machine in the world.

"supercomputer." The Columbia Encyclopedia, 6th ed. Encyclopedia.com. (April 23, 2017). http://www.encyclopedia.com/reference/encyclopedias-almanacs-transcripts-and-maps/supercomputer

supercomputer

supercomputer A class of very powerful computers that have extremely fast processors, currently (2004) capable of performing several Tflops (1 Tflop = 10^12 floating-point operations per second; see flops); most are now multiprocessor systems (see also SMP, MPP). Large main-memory capacity and long word lengths are the other main characteristics. Supercomputers are used, for example, in meteorology, engineering, nuclear physics, and astronomy. Several hundred are in operation worldwide at present. Principal manufacturers are Cray Research, and NEC, Fujitsu, and Hitachi of Japan.

"supercomputer." A Dictionary of Computing. Encyclopedia.com. (April 23, 2017). http://www.encyclopedia.com/computing/dictionaries-thesauruses-pictures-and-press-releases/supercomputer

supercomputer

su·per·com·put·er / ˈsoōpərkəmˌpyoōtər/ • n. a particularly powerful mainframe computer. DERIVATIVES: su·per·com·put·ing / -ˌpyoōting/ n.

"supercomputer." The Oxford Pocket Dictionary of Current English. Encyclopedia.com. (April 23, 2017). http://www.encyclopedia.com/humanities/dictionaries-thesauruses-pictures-and-press-releases/supercomputer