Information (Encyclopedia of Science and Religion)
The word information is used in three principal senses: (1) the mathematical sense from which arises the theory of digital communication or information theory; (2) the linguistic sense in which it is synonymous with the dissemination of meanings understood by members of a culture; and (3) the formative sense in which information denotes the process of giving shape to some medium or substance.

Kinds of information

Counting-information is mathematical information as defined by American mathematician and engineer Claude Shannon (1916–2001) in a paper on communication theory written in 1948. It has nothing directly to do with meaning; rather, it relates solely to an arbitrary measure based upon the theory of probability.


Meaning-information is information in the colloquial sense of knowledge. It is completely different from Shannon's concept of information; it is interpretation-, language-, and culture-dependent.


Shaping-information denotes information as a noun describing the action of giving form to something. It is the oldest sense of the word, originating in the Latin verb informare, further reflected in current usage in the German informieren and the French informer. In this sense, one can speak of the "information" of a system when one imposes constraints upon its degrees of freedom, for example by giving content and structure to a spreadsheet.

Construed in these three ways, information crosses boundaries between physics, culture, and mind. In its modern, counting-information sense, especially in the realm of information technology, it seems to have taken on a life of its own, as if the process of rendering things digitally had some intrinsic value apart from its use in conveying meaning and enabling people to shape the world. As with any new technology (the telephone, the television, the motor car, the mobile phone), there is a period during which fascination with the technology itself supplants the wisdom that governs its use, but eventually the more important purposes resume their ascendancy, and the technology once again comes to be seen as no more than a tool.

The religious significance of the science of information is best understood in terms of the articulation of meaning and the establishment of a balanced view of the place of information in human life. That process is in full swing as digitization, the Internet, global communication, and the dissolution of historical boundaries reshape how people conceive of themselves and how they decide to live their lives.

If technology is to serve rather than dictate human needs, it is essential that people retain their capacity to think creatively, that is, to generate the ideas that give shape to the technology by investing it with significant meanings. Otherwise human needs will increasingly be at the mercy of the agendas of those individuals, corporations, and nation-states that control the technology, and people will be powerless to resist their influence by giving expression to their own objectives. The articulation of worthy religious goals is one contribution that theology can make to the restoration of the balance between creative thought and technological power.

Counting-information

The mathematical concept of counting-information is based upon binary arithmetic: the ability to distinguish between two states, typically represented as 0 and 1, in an electronic device. Each such two-state unit is called a binary digit, or bit. Combinations of these states allow data to be encoded in strings, such as 01110101010, that can be stored in two-state devices and transmitted down communication channels. Electronic circuits that distinguish between only two states are relatively easy to devise, although higher-state devices are possible. The process of encoding facts about the world in such binary strings is called digitization, although any particular encoding is arbitrary.
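To make the idea concrete, here is a minimal Python sketch of digitization: mapping characters to 8-bit strings and back. The particular mapping (standard ASCII code points) is only one of many; as noted above, any reversible encoding is equally arbitrary.

    def to_bits(text):
        """Encode each character as an 8-bit binary string (ASCII code points)."""
        return " ".join(format(b, "08b") for b in text.encode("ascii"))

    def from_bits(bits):
        """Reverse the encoding to recover the original text."""
        return bytes(int(b, 2) for b in bits.split()).decode("ascii")

    encoded = to_bits("Hi")
    print(encoded)             # 01001000 01101001
    print(from_bits(encoded))  # Hi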

A string of n bits can exist in 2^n different states and so can represent 2^n different symbols. For example, when n = 3, the string can be 000, 001, 010, 011, 100, 101, 110, or 111. If a particular encoding treats these strings as binary numbers, they represent 0, 1, 2, . . . , 7; another encoding might treat them as a, b, . . . , h. In the early years of computing it was thought that 256 different strings would be sufficient to encode most common letters, numbers, and control codes. The number of bits required to store a given amount of data is therefore usually measured in eight-bit units called bytes because of the number of different states of a single byte (2^8 = 256). Numbers of bits are counted in powers of 2, so a kilobyte is 2^10 = 1024 bytes; a megabyte is 1024 kilobytes (1024K); and a gigabyte is 1024 megabytes. Typical hard disks can now store between 20 and 100 gigabytes.
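The arithmetic is easy to verify. A short Python sketch, using the figures quoted above:

    n = 3
    print([format(i, "03b") for i in range(2 ** n)])
    # ['000', '001', '010', '011', '100', '101', '110', '111']
    print(2 ** 8)             # 256 states of one 8-bit byte
    print(2 ** 10)            # 1024 bytes in a kilobyte
    print(2 ** 10 * 2 ** 10)  # 1,048,576 bytes in a megabyte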

The states of a binary system are typically called 0 and 1, True and False, or Yes and No. The system itself is oblivious to these interpretations of the two possible states of a bit, and it is helpful to distinguish between system states and interpretations of those states, for example using the terminology of counting-, meaning- and shaping-information.


The physics of information

The physics of information has given rise to some remarkable results. Shannon showed that there are limits to the rate at which information can be transmitted down a channel of a given capacity if it is to retain its integrity. Leo Szilard and Leon Brillouin demonstrated that there are fundamental limits to the rate at which information can be processed at a given temperature. Jacob Bekenstein showed that the amount of information an object can contain (the Bekenstein bound) is set by its mass and size. Some, such as Carl Friedrich von Weizsäcker, have attempted to reconstruct all of physics in information-theoretic terms by conceiving of all physical processes as streams of information. Still others have used the concept to seek a fundamental link between information and thermodynamic entropy.
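Shannon's transmission limit can be stated exactly. For a channel of bandwidth B hertz and signal-to-noise ratio S/N, the Shannon-Hartley theorem gives the capacity as C = B log2(1 + S/N) bits per second. A small Python illustration, with representative rather than measured figures:

    import math

    def channel_capacity(bandwidth_hz, snr):
        """C = B * log2(1 + S/N): the maximum rate of error-free transmission."""
        return bandwidth_hz * math.log2(1 + snr)

    # A telephone-grade line: roughly 3 kHz of bandwidth and an SNR near 1000
    print(channel_capacity(3000, 1000))  # about 30,000 bits per second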

The ability to transfer information digitally requires data to be encoded in binary form; the limitations of such transmission are the subject of information theory as first elaborated by Shannon. However, information is not always easily converted to digital form, especially when it arises from continuous analogue processes, for which exact conversion into a discrete coded form is impossible. Neither are the processes that arise from and are useful to human beings easily distilled into the pure digital states required by computers. Some of the most difficult problems faced by those who work in information technology concern the accommodation of computer systems to the untidiness of the data and processes typical of human life.
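A minimal sketch of why such conversion loses information: sampling a continuous signal and quantizing each sample to three bits discards detail that no later processing can recover. The signal and bit depth below are chosen only for illustration.

    import math

    def quantize(x, bits, lo=-1.0, hi=1.0):
        """Snap x in [lo, hi] to the nearest of 2**bits evenly spaced levels."""
        levels = 2 ** bits
        step = (hi - lo) / (levels - 1)
        return round((x - lo) / step) * step + lo

    samples = [math.sin(2 * math.pi * t / 8) for t in range(8)]
    for s in samples:
        print(f"{s:+.4f} -> {quantize(s, 3):+.4f}")  # the difference is lost for good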

The question of the fundamental nature of information is philosophically and physically deep. It is irrelevant whether one counts to base 2 (as in binary systems) or some other base, but the question of what one is measuring cannot be avoided, and touches some of the hardest questions in physics.

The state of a bit cannot be detected without degrading energy and so increasing the net entropy of the universe; this is one expression of the familiar truth that one cannot obtain something for nothing. The Scottish physicist James Clerk Maxwell (1831–1879) once proposed a thought experiment in which a demon capable of detecting the movement of gas molecules could open and close a trapdoor to let fast molecules through and keep slow molecules out, thus raising the temperature on one side of the partition and infringing the second law of thermodynamics. It is now generally accepted that the flaw in this argument arises from the need to increase the entropy of the universe in order to ascertain the state of each molecule; in other words, reading bits of information has a thermodynamic cost.
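The cost can be quantified. Landauer's principle, which grew out of the line of argument begun by Szilard and Brillouin, puts the minimum energy dissipated per bit at kT ln 2, where k is Boltzmann's constant and T the absolute temperature. A one-line calculation in Python:

    import math

    k_B = 1.380649e-23   # Boltzmann constant, joules per kelvin
    T = 300.0            # room temperature, kelvin

    print(k_B * T * math.log(2))  # about 2.9e-21 joules per bit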


Encoding and encryption

Although encryption is important in the social and political realms affected by information technology, its fundamentals are mathematical and fall within the realm of information theory. The details of modern encryption involve difficult mathematics, but the essential process is not hard to understand. In a simple code or cipher one typically expects to move from an everyday symbol such as 1 or a to a binary string such as 000, to store and manipulate that string in a computer, and then to decode the result by reversing the encoding process.

Unfortunately, anyone familiar with the encoding can decode the results, and there are times when one does not wish one's messages to be read, perhaps because they contain private commercial information, perhaps because they contain the plans of criminals or terrorists, perhaps because they contain state secrets. People therefore need a way to transmit messages in code. But if the recipient is to decode them, it seems that the decoding rules must also be transmitted, and they could themselves be intercepted, compromising the integrity of the message. What is more, it is far harder to know whether an electronic communication has been intercepted than a physical one such as a book or letter. What is needed, then, is a way to transmit code that does not require the recipient to be told what the encoding process involves.

Fortunately, such a way has been devised. It is now embodied in the RSA procedure and is as strong or as weak as the number of bits employed in the encryption. The procedure works as follows. Two very large prime numbers that are intrinsically difficult to guess or find (the private key) are used with another number to generate a pair of numbers (the public key) that everyone may know. This process is essentially irreversible: there is no tractable way to recover the original two primes from the public key. Anyone who wishes to send me an encoded message uses the public key to encrypt it, and I, using my private key, can decode it. Anyone intercepting the encrypted message, even in possession of the public key, cannot decrypt it, because they cannot get back to the private key needed to do so. The strength of the system lies in the size of the key: a 40-bit key is deemed very difficult to crack; a 128-bit key is deemed almost impossible with current hardware; a 256-bit key could not be decrypted within the lifetime of the universe.
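The whole procedure can be seen in miniature. The Python sketch below uses deliberately tiny primes so that every step is visible; it illustrates the public/private relationship described above and offers no security whatever, since real RSA primes run to hundreds of digits.

    p, q = 61, 53              # the secret primes (the private side)
    n = p * q                  # 3233; published as part of the public key
    phi = (p - 1) * (q - 1)    # 3120; computable only if p and q are known
    e = 17                     # public exponent; (n, e) is the public key
    d = pow(e, -1, phi)        # 2753; the private exponent (Python 3.8+)

    message = 65
    ciphertext = pow(message, e, n)    # anyone can encrypt with (n, e)
    recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
    print(ciphertext, recovered)       # 2790 65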


See also Information Technology; Information Theory


Bibliography

Leff, Harvey S., and Rex, Andrew F. Maxwell's Demon: Entropy, Information, Computing. Bristol, UK: Adam Hilger, 1990.

Puddefoot, John C. "Information and Creation." In The Science and Theology of Information: Proceedings of the Third European Conference on Science and Theology, ed. Christoff Wassermann, Richard Kirby, and Bernard Rordorff. Geneva, Switzerland: Labor et Fides, 1992.

Puddefoot, John C. "Information Theory, Biology, and Christology." In Religion and Science: History, Method, Dialogue, eds. W. Mark Richardson and Wesley J. Wildman. New York and London: Routledge, 1996.

Rényi, Alfréd. A Diary on Information Theory. New York: Wiley, 1984.

Shannon, Claude E. "A Mathematical Theory of Communication." The Bell System Technical Journal 27 (1948): 379–423, 623–656.

Singh, Simon. The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography. London: Fourth Estate, 1999.

John C. Puddefoot


information (A Dictionary of Computing)

information Generally, information is whatever is capable of causing a human mind to change its opinion about the current state of the real world. Formally, and especially in science and engineering, information is whatever contributes to a reduction in the uncertainty of the state of a system; in this case, uncertainty is usually expressed in an objectively measurable form. Commonly, this is done by means of Shannon's entropy. Nevertheless, this formula for uncertainty involves probabilities, and these may well have to be subjective. If that is so, the formal measurement must be qualified as depending on subjective probabilities, and “uncertainty” must be replaced by “opinion, or personal estimate, of uncertainty”.

Information must be distinguished from any medium that is capable of carrying it. A physical medium (such as a magnetic disk) may carry a logical medium (data, such as binary or text symbols). The information content of any physical objects, or logical data, cannot be measured or discussed until it is known what range of possibilities existed before and after they were received. The information lies in the reduction in uncertainty resulting from the receipt of the objects or the data, and not in the size or complexity of the objects or data themselves. Questions of the form, function, and semantic import of data are only relevant to information inasmuch as they contribute to the reduction of uncertainty. If an identical memorandum is received twice, it does not convey twice the information that its first occurrence conveyed: the second occurrence conveys no information at all, unless, by prior agreement, the number of occurrences is itself to be regarded as significant.
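Shannon's entropy makes this computable. For a source with outcome probabilities p_i, the uncertainty is H = -sum of p_i log2 p_i bits; an outcome that is certain contributes nothing, which is why the expected second copy of a memorandum conveys no further information. A minimal Python sketch:

    import math

    def entropy(probabilities):
        """Shannon entropy, in bits, of a discrete distribution."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(entropy([0.5, 0.5]))  # 1.0 bit: a fair coin flip
    print(entropy([0.9, 0.1]))  # ~0.47 bits: a more predictable source
    print(entropy([1.0]))       # 0.0 bits: a certain outcome tells us nothing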

Information has ramifications in security, politics, culture, and the economy, as well as in science and engineering. The extent to which information is used as an economic commodity is one of the defining characteristics of the “post-industrial” society, hence the phrase “the information society”.


information (The Oxford Pocket Dictionary of Current English)

in·for·ma·tion / ˌinfərˈmāshən/ • n. 1. facts provided or learned about something or someone: a vital piece of information. ∎ Law a formal criminal charge lodged with a court or magistrate by a prosecutor without the aid of a grand jury: the tenant may lay an information against his landlord. 2. what is conveyed or represented by a particular arrangement or sequence of things: genetically transmitted information. ∎ Comput. data as processed, stored, or transmitted by a computer. ∎  (in information theory) a mathematical quantity expressing the probability of occurrence of a particular sequence of symbols, impulses, etc., as contrasted with that of alternative sequences. DERIVATIVES: in·for·ma·tion·al / -shənl/ adj. in·for·ma·tion·al·ly / -shənl-ē/ adv.


Information (West's Encyclopedia of American Law)

The formal accusation of a criminal offense made by a public official; the sworn, written accusation of a crime.

An information is tantamount to an indictment in that it is a sworn written statement which charges that a particular individual has done some criminal act or is guilty of some criminal omission. The distinguishing characteristic between an information and an indictment is that an indictment is presented by a grand jury, whereas an information is presented by a duly authorized public official.

The purpose of an information is to inform the accused of the charge against him, so that the accused will have an opportunity to prepare a defense.


information (The Columbia Encyclopedia, 6th ed.)

information, in law: see indictment.
