computing and information technology

computing and information technology. Computing science developed as an aid to mathematics. John Napier, the discoverer of logarithms, described in 1617 a calculating machine made of ivory rods (‘Napier's bones’) which, he explained, would have helped the laborious calculations that ‘ought to have been accomplished by many computers’, that is, by human calculators. Pascal in France developed an arithmetical machine in the 1640s, which Leibniz improved upon, and Sir Samuel Morland, once Pepys's tutor at Magdalene College, Cambridge, invented not only a speaking trumpet, writing-machine, and water-pump, but a calculating device: ‘very pretty, but not very useful’ was Pepys's laconic verdict. Charles Babbage is rightly regarded as the immediate pioneer of computers, but the limitations of his work should not be ignored. His ‘difference engine’ was never completed; in Reflections on the Decline of Science in England (1830) he blamed this on inadequate government funding, a cry that has been heard since. His machine was also purely mechanical, whereas modern computers depended upon electronic developments. But his later ‘analytical engine’, which Babbage admitted was no more than a ‘shadowy vision’, would have been not merely a calculator: programmed by punched cards, it would have included a storage facility and a print-out. The Scheutz father and son in Sweden produced a commercially viable difference engine in the 1850s, one of which was used by the British registrar-general. In America, Hollerith, working on census returns, developed a machine using electric current and capable of analysing returns at speed. His Tabulating Machine Company (1896) was the forerunner of International Business Machines (IBM), which dominated the computer market for many years.

By the end of the 19th cent. the use of computers for purely statistical purposes was well established. Their introduction to a wider public came essentially after the Second World War. At Bletchley Park, a high-powered British team succeeded in cracking the German Enigma code, and went on to build Colossus, specially designed to analyse enciphered German teleprinter messages at speed and identify correlations. Early mainframe computers were enormous monsters: IBM's Automatic Sequence Controlled Calculator (ASCC, 1944), also known as the Harvard Mark I, was 51 feet long and weighed 5 tons. Not until the invention of the microchip (integrated circuit), which evolved in the late 1950s out of work on semiconductors and transistors, did personal computers become possible, changing access to computing out of all recognition and liberating users from the domination of experts. Digital Equipment Corporation introduced its PDP-8 minicomputer in 1965 and was followed by a host of competitors. Even so, the first acquaintance of ordinary people with the new technology was likely to have been with pocket calculators (‘ready reckoners’), which led to a debate on whether they should be used in schools and universities and what effect they would have on mental arithmetic. But by the 1980s computers, long established in offices, were turning up in stores, municipal administration, and libraries, and increasingly in spare bedrooms and basements. Babbage had been convinced that his work demonstrated the argument from design and that the world operated as a great calculating-machine, programmed by God. But, as with many discoveries, the excessive claims of pioneers and salesmen have given way to more sober assessments: computer manuals discuss chaos theory, an expensive Ariane rocket blew up on launch (1996), and it is worth remembering that Babbage lost a great deal of money trying to devise an infallible scheme for winning on horses.

J. A. Cannon

The Oxford Companion to British History
