computing and information technology

computing and information technology. Computing science developed as an aid to mathematics. John Napier, the discoverer of logarithms, described in 1617 a calculating device made of ivory rods (‘Napier's bones’) which, he explained, would have eased the laborious calculations that ‘ought to have been accomplished by many computers’. Pascal in France developed an arithmetical machine in the 1640s, which Leibniz improved upon, and Sir Samuel Morland, once Pepys's tutor at Magdalene College, Cambridge, invented not only a speaking trumpet, writing-machine, and water-pump, but a calculating device: ‘very pretty, but not very useful’ was Pepys's laconic verdict. Charles Babbage is rightly regarded as the immediate pioneer of computers, but the limitations of his work should not be ignored. His ‘difference engine’ was never completed (in The Decline of Science (1830) he blamed this on inadequate government funding, a complaint heard many times since), and his machine was purely mechanical, whereas modern computers depended upon electronic developments. But his later ‘analytical engine’, which Babbage admitted was no more than a ‘shadowy vision’, would have been not merely a calculator: programmed with punched cards, it would have included a storage facility and a print-out. The Scheutz father and son in Sweden produced a commercially viable difference engine in the 1850s, one of which was used by the British registrar-general to compute and print tables. In America, Hollerith, working on census returns, developed a machine that used electric current and could analyse returns at speed. His Tabulating Machine Company (1896) was the forerunner of International Business Machines (IBM), which dominated the computer market for many years.

By the end of the 19th cent. the use of computing machines for purely statistical purposes was well established. Their introduction to a wider public came essentially after the Second World War. At Bletchley Park, a high-powered British team succeeded in cracking the German Enigma code, and went on to build Colossus, an electronic machine designed to analyse intercepted German teleprinter messages at speed and identify correlations. Early mainframe computers were enormous monsters. IBM's Automatic Sequence Controlled Calculator (ASCC, 1944) was 51 feet long and weighed 5 tons. Not until the invention of the microchip (integrated circuit), which evolved in the late 1950s out of work on semiconductors and transistors, did personal computers become possible, changing access to computing out of all recognition and liberating users from the domination of experts. Digital Equipment Corporation introduced its PDP-8 minicomputer in 1965 and was followed by a host of competitors. Even so, the first acquaintance of ordinary people with the new technology was likely to have been with pocket calculators (‘ready reckoners’), which prompted a debate on whether they should be used in schools and universities and what effect they would have on mental arithmetic. But by the 1980s computers, long established in offices, turned up in stores, municipal administration, and libraries, and increasingly in spare bedrooms and basements. Babbage had been convinced that his work demonstrated the argument from design and that the world operated as a great calculating-machine, programmed by God. But, as with many discoveries, the excessive claims of pioneers and salesmen have since been tempered. Computer manuals discuss chaos theory, an expensive Ariane rocket blew up on launch in 1996 because of a software fault, and it is worth remembering that Babbage lost a great deal of money trying to invent an infallible scheme for winning on horses.

J. A. Cannon
