Computer, Analog


A digital computer employs physical device states as symbols, but an analog computer employs them as models. That is, in a digital computer the on-off states of transistor devices are used to stand for 0s and 1s, which are in turn made to stand for phenomena of interest, such as words, data, or pixels; in an analog computer, the continuously variable states of various electronic devices are made to behave like some physical system of interest. A rough parallel would be using pencil-and-paper mathematics to determine the carrying capacity of an arch design (symbolic computing) versus building a scale model out of balsa wood and seeing how much weight it will bear (analog computing).

A more precise definition of analog computing is as follows: an analog computer models the behaviors of smoothly varying mathematical variables (usually representing physical phenomena such as temperatures, pressures, or velocities) by translating these variables into, usually, voltages or gear movements. It then manipulates these physical quantities so as to solve the equations describing the original phenomena. Thermometers and scales can be viewed as rudimentary analog computers: they translate an unwieldy physical phenomenon (e.g., a patient's temperature or weight) into a manageable physical model or analog (e.g., the volume of a fixed quantity of colored alcohol or the displacement of a spring) that has been designed to vary linearly with the phenomenon to be measured; the device then derives a numerical result from the model (e.g., by aligning the alcohol level or a pointer with a printed scale). Another example: pouring three equal units of water into an empty vertical tube and measuring the height of the result would be an analog method of computing that x + x + x = 3x.
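
The water-tube example can be made concrete in a few lines of code. The sketch below is an illustration added for this discussion, not part of the original article; the function name and parameters are invented. It models analog addition by accumulating a continuous quantity and then reading it back with the limited precision typical of a physical scale:

```python
import random

def analog_add(volumes_ml, tube_area_cm2=1.0, read_error_cm=0.05):
    """Pour each volume into a vertical tube and read the final height.

    The height of the water column is the physical analog of the sum;
    read_error_cm models the limited precision of reading a printed
    scale, which is characteristic of analog computation.
    """
    # 1 ml of water raises the level of a 1 cm^2 tube by 1 cm.
    height_cm = sum(volumes_ml) / tube_area_cm2
    return height_cm + random.uniform(-read_error_cm, read_error_cm)

x = 2.0  # one "unit" of water, in ml
print(analog_add([x, x, x]))  # approximately 3x = 6.0, to reading precision
```

The read-out noise is the point of the exercise: an analog result is only as exact as the measurement of the physical model.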

The earliest known analog computer is the astrolabe. First built in Greece during the first century BC, this device used pointers and scales on its face and a complex arrangement of bronze gears to predict the motions of the Sun, planets, and stars.

Other early measuring devices were also analog computers. Sundials, for example, traced a shadow's path to show the time of day. Spring-operated weight scales, which have been used for centuries, convert the pull on a stretched spring to numerical units of weight. The slide rule was invented about 1620 and was used until superseded by the electronic calculator in the late twentieth century.

In 1905, Rollin Harris (1863–1918) and E. G. Fisher (1852–1939) of the United States Coast and Geodetic Survey started work on a calculating device that would forecast tides. It was not the first such device, but it was the most complex ever built. Dubbed the Great Brass Brain, it was 11 ft (3.35 m) long and 7 ft (2.1 m) high, weighed 2,500 lb (1,135 kg), and contained a maze of cams, gears, and rotating shafts. Completed in 1910, the machine worked as follows: an operator set 37 dials (each representing a particular geological or astronomical variable) and turned a crank. The computer then drew up tidal charts for as far into the future as the operator wished. It made accurate predictions and was used for 56 years before being retired in 1966.
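
Mechanical tide predictors of this era worked by summing harmonic constituents: each dial set the amplitude and phase of one periodic term, and the gears and pulleys evaluated the sum. The Python sketch below illustrates the principle with four standard constituent speeds; the amplitudes, phases, and mean level are invented illustrative values, not settings from the actual machine, which summed 37 terms:

```python
import math

# Each "dial" corresponds to one harmonic constituent: an amplitude,
# an angular speed (degrees per hour), and a phase. The speeds are
# standard constituent speeds; amplitudes and phases are made up.
CONSTITUENTS = [
    # (name, amplitude_ft, speed_deg_per_hr, phase_deg)
    ("M2", 2.1, 28.984, 120.0),  # principal lunar semidiurnal
    ("S2", 0.7, 30.000, 150.0),  # principal solar semidiurnal
    ("K1", 0.5, 15.041, 200.0),  # lunisolar diurnal
    ("O1", 0.4, 13.943, 180.0),  # lunar diurnal
]

def tide_height(t_hours, mean_level_ft=5.0):
    """Sum the harmonic terms, as the machine's gearing did."""
    return mean_level_ft + sum(
        a * math.cos(math.radians(speed * t_hours + phase))
        for _, a, speed, phase in CONSTITUENTS
    )

# Chart a day of tides, hour by hour (the machine drew this with a pen).
for hour in range(24):
    print(f"{hour:02d}:00  {tide_height(hour):5.2f} ft")
```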

Vannevar Bush (1890–1974), an electrical engineer at the Massachusetts Institute of Technology, created what is considered to be the first modern analog computer in the 1930s. Bush and a team from MIT's electrical engineering staff, discouraged by the time-consuming computation of the differential equations required to solve certain engineering problems, began work on a device to solve these equations automatically. The first version of their device, dubbed the differential analyzer, was unveiled in 1930; the second in 1935. The latter weighed 100 tons and contained 150 motors and hundreds of miles of wire connecting relays and vacuum tubes; instructions could be fed to the machine on hole-punched paper tape. Three copies of the machine were built for military and research use. Over the next 15 years, MIT built several new versions of the computer. By present standards the machine was slow, only about 100 times faster than a human operator using a desk calculator. Like most analog computers since, the MIT machines modeled phenomena using voltages and contained a number of standard voltage-manipulating modules (integrators, differentiators, adders, inverters, and so forth) whose connections could be reconfigured to model, within limits, any desired equation.
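
To see what reconfiguring the modules to model an equation means in practice, consider the classic differential-analyzer setup for simple harmonic motion, x'' = -w^2 x: two integrators wired in a feedback loop, the first turning acceleration into velocity and the second turning velocity into position. The sketch below steps that loop numerically; it is a schematic illustration of the patching idea under invented parameter values, not a model of MIT's actual machine:

```python
# A minimal numerical sketch of how a differential analyzer was
# "programmed" to solve x'' = -w^2 * x: the output is fed back,
# inverted and scaled, as the acceleration input to two integrators.

def differential_analyzer(w=1.0, x0=1.0, v0=0.0, dt=0.001, t_end=10.0):
    x, v, t = x0, v0, 0.0
    trajectory = []
    while t < t_end:
        a = -w * w * x  # feedback path through the inverter and multiplier
        v += a * dt     # first integrator: acceleration -> velocity
        x += v * dt     # second integrator: velocity -> position
        t += dt
        trajectory.append((t, x))
    return trajectory

# x(t) should track cos(w*t); print a few samples to check.
for t, x in differential_analyzer()[::2000]:
    print(f"t = {t:5.2f}  x = {x:+.3f}")
```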

In the 1950s, RCA produced the first reliable design for a fully electronic analog computer, but by this time many of the most complex functions of analog computers were being assumed by faster and more accurate digital computers. Analog computers are still used today for specialized applications in scientific calculation, engineering design, industrial process control, and spacecraft navigation. Neural networks are an active research subfield in analog computing. Research in the late 1990s suggested that, in theory, an ideal analog computer might be able to solve certain problems beyond the reach of even an ideal digital computer with unlimited processing power. However, it is likely that digital computers, due to their great flexibility, will continue to dominate computing for the foreseeable future.

Resources

OTHER

Bains, Sunny. "Analog Computer Trumps Turing Model." EE Times. November 3, 1998. <http://www.eetimes.com/story/OEG19981103S0017> (accessed January 6, 2003).

Ulmann, Bernd. "Analog and Hybrid Computing." May 2006. <http://fafner.dyndns.org/vaxman/publications/anhyb.pdf> (accessed October 23, 2006).


Computer, Analog

A digital computer performs calculations based solely upon numbers or symbols. An analog computer, on the other hand, translates continuously changing quantities (such as temperature, pressure, weight, or speed) into corresponding voltages or gear movements. It then performs "calculations" by comparing, adding, or subtracting voltages or gear motions in various ways. The final result is sent to an output device such as a cathode-ray tube or pen plotter on a roll of paper. Common devices such as thermostats and bathroom scales are actually simple analog computers: they "compute" one thing by measuring another. They do not count.

Early analog computers

The earliest known analog computer is an astrolabe. First built in Greece around the second century BC, the device uses gears and scales to predict the motions of the Sun, planets, and stars. Other early measuring devices are also analog computers. Sundials trace a shadow's path to show the time of day. The slide rule (a device used for calculation that consists of two rules with scaled numbers) was invented about 1620 and is still used, although it has been almost completely replaced by the electronic calculator.

Modern analog computers

Vannevar Bush, an electrical engineer at the Massachusetts Institute of Technology (MIT), created in the 1930s what is considered to be the first modern analog computer. He and a team from MIT's electrical engineering staff, discouraged by the time-consuming computation of the differential equations required to solve certain engineering problems, began work on a device to solve these equations automatically. In 1935, they unveiled the second version of their device, dubbed the "differential analyzer." It weighed 100 tons and contained 150 motors and hundreds of miles of wires connecting relays and vacuum tubes. By present standards the machine was slow, only about 100 times faster than a human operator using a desk calculator.

In the 1950s, RCA produced the first reliable design for a fully electronic analog computer. By this time, however, many of the most complex functions of analog computers were being assumed by faster and more accurate digital computers. Analog computers are still used today for some applications, such as scientific calculation, engineering design, industrial process control, and spacecraft navigation.

[See also Computer, digital]

analog computer

A computer that performs computations (such as summation, multiplication, integration, and other operations) by manipulating continuous physical variables that are analogs of the quantities being subjected to computation. The most commonly used physical variables are voltage and time. Some analog computers use mechanical components: the physical variables become, for example, angular rotations and linear displacements. See also discrete and continuous systems.