views updated May 11 2018


Millions of people around the world live surrounded by information and information technologies. They expect to hear the news on the radio, enjoy entertainment programming on the television, purchase any book in print, and find a website on the Internet. They expect their homes to contain televisions, cable, videocassette recorders, compact discs, answering machines, fax machines, telephones, personal computers, and satellite dishes. Many work in occupations where they produce and distribute information that is of value to consumers and businesses. In other words, the lives of hundreds of millions—perhaps even billions—of people depend on information. On a typical morning, families around the world will turn on the radio to hear the news, read the morning paper, watch the weather channel on cable, make a telephone call, and gather office reports and schoolbooks—all before anyone leaves the house. In fact, they are so used to this way of living that they take information for granted.

What Is Information?

Many information users might find it hard to respond to the question "What is information?" To begin to form an answer, some basic observations are necessary. First of all, many words convey the idea of information—words such as "data," "knowledge," "writing," "speaking," "sign," and "symbol," to name just a few. Second, a name, a poem, a table of numbers, a novel, and a picture all contain a shared quality called "information." Third, most people will acknowledge that a message passed between two friends contains information. And, fourth, it is evident that many people put a high value on some information but not all information. Each of these four characteristics offers clues to answering the question "What is information?"

One clue can be found in an everyday behavior. People make decisions on a daily basis about all sorts of situations; and, when they do, they often seek information to help them make those decisions. When confronting a choice, people often find that they need more information in order to make a decision. For example, when considering the purchase of an automobile, a potential buyer might ask a neighbor for advice, read a consumer magazine, or take a car for a test drive. Each of these actions provides information to help the person decide which car to buy. Individuals who are confronting important decisions often seek as much information as they can obtain. In contrast, when facing a decision with little or no available information, individuals may hesitate to make a choice. Therefore, the human need to make decisions creates a demand for information and leads to the interest in understanding information.

Information scientists generally emphasize that individuals mostly use information to reduce uncertainty—that is, to clarify something of interest in order to make a decision. Most modern information scientists agree with a popular definition of information such as the following: "Information is a coherent collection of data, messages, or cues organized in a particular way that has meaning or use for a particular human system" (Ruben, 1988, p. 19). This definition may seem vague, but that is because information comes in so many forms.

Humans are always trying to make sense of the world around them, and for that they need data. In fact, anything can provide data because data are the raw stimuli that human brains use to produce information. An ocean, a thunderstorm, and a crowd all become data as soon as someone tries to make sense of them. A thunderstorm, for example, might be just so much noise and water to one person, but to a meteorologist, that same noise and water might form the beginning of an understanding of weather patterns. By observing the direction of the storm and measuring its force, the meteorologist is producing data. If the data are then organized into statistical tables, or into a weather report, the meteorologist will have transformed data into information. The weather report on the evening news thus conveys information that had its beginning in data. However, data only become information when organized into a form that can be communicated, such as the weather report. Data are the observations or cues collected in order to produce information, and information is what individuals share with each other when they communicate.

Humans build ideas from information. All humans convert data into information and then use information to reduce the uncertainty they face when making decisions—from simple decisions such as choosing a cereal for breakfast to complex decisions such as choosing a college. Furthermore, that same mental versatility gives humans the power to perform a truly remarkable feat. Every person can enter a room, close the door, shut off the lights (along with other stimuli), and emerge later with a new idea—that is, with new information. No old information was lost or consumed, yet new information was added, at no expenditure of energy beyond the energy expended in thinking. In other words, the same amount of information can produce many new ideas without being used up.

Similarly, two individuals can receive the same information, think about it, and produce new information with opposing interpretations. What is remarkable is that the information each received was the same, while the new information produced was different: same input, different outputs because each brain is unique. Each human takes data as input, organizes the input into a form that produces new information, and then makes sense of it by relating it to other ideas, thus bringing forth individual knowledge. The brain can expend a quantity of energy and think no new thoughts, or it can expend that same quantity of energy and invent a new cure for cancer. Brains are so capable of manipulating information that they can recombine the same information into an infinite number of new ideas. Nothing in the world of physical things behaves this way.

The Idea of Information in the Sciences

Information has become a useful concept in the sciences. For example, cellular biologists speak of deoxyribonucleic acid (DNA) as a library that contains information; they consider genes to be information that is communicated to a new cell through mitosis. Economists discuss money as information that is increasingly transmitted across the Internet as electronic commerce. And, computer scientists consider each bit on a hard disk to be the smallest quantity of data. Each of these fields of inquiry has achieved advances by applying the idea of information to the problems that they study. However, whether they are actually describing information or employing the concept of information as a metaphor remains controversial. For example, some information scientists argue that when biologists describe DNA as a library that contains information, they are using the concept of information as a metaphor; some biologists, in contrast, argue that DNA is actually information. No one has yet developed a theory to explain this controversy. Nevertheless, in these fields and in others, the idea of information has been a useful concept for solving scientific problems. As a useful word in the English language, "information" has a very long history.

Information in Historic Language

The word that English speakers recognize as "information" has its origins in the Latin word informare. The Latin informare meant to give form to, to shape, to form an idea of, or even to describe, so the seed of the modern meaning can be discerned in the use of informare to mean the shaping of an idea in one's head—that is, to inform.

Geoffrey Chaucer introduced the word "information" into the English language in the "Tale of Melibee," one of his Canterbury Tales: "Whanne Melibee hadde herd the grete skiles and resons of Dame Prudence and hire wise informaciouns and techynges." The "Tale of Melibee" was probably written sometime between 1372 and 1382. Chaucer's use of the word "informaciouns" (informations) would roughly fit the meaning that contemporary English speakers give to the word "sayings." However, as time went by, other meanings gained greater popularity.

In Gulliver's Travels (1726), Jonathan Swift applied a meaning to the word "information" that appears as early as the mid-fifteenth century and sounds more familiar: "It was necessary to give the reader this information." Thomas Jefferson, in an 1804 letter, used "information" as if it referred to a physical object: "My occupations… deny me the time, if I had the information, to answer them."

In the twentieth century, scientists began to write as if information were a quantifiable variable, as in the following passage from the November 1937 issue of Discovery: "The whole difficulty resides in the amount of definition in the [television] picture, or, as the engineers put it, the amount of information to be transmitted in a given time." By the beginning of the twenty-first century, English speakers had adopted the senses of information as a physical object and quantifiable variable. Taken together, these uses facilitate communicating in an information society.

Information Versus Physical Objects

It seems so simple, but to make full use of the idea of information, people play a curious game with words. To create a language of information, people must adapt the language of the real world. In everyday conversations, individuals speak about information as though it were something made from physical materials. For example, a teacher might decide that one report contains more information than another. Such a comparison implies that information is a quantity that can be measured in terms of more and less. That same teacher might describe the reports as if they were jars filled with information, so that one report might be filled with more information than the other.

Of course, information does not fill jars, nor can it be easily determined when one has more or less information. Questions that are applicable to physical objects make less sense when applied to forms of information. What color is information? Is one idea bigger than another? Does one idea weigh more than another? When information is lost, where does it go? These questions seem illogical when asked of information because information is symbolic, not physical; that is, information exists as meaning in the minds of humans, whereas objects of the physical world take up space and exist whether or not anyone thinks about them. As a result of this difference, communicating about information poses a challenge because the English language has a limited vocabulary for representing the symbolic realm of information. English speakers solve this problem by employing words that are meant to describe the physical world. In other words, when members of the English-speaking world discuss information, they pretend that information is similar to a physical object. This makes sense even though information does not behave in the same way as a physical object.

Consider that if a person gives a sweater as a gift, the recipient gains the sweater, while the gift giver no longer has it; an exchange has resulted, and resources have moved from one place to another. The gift recipient has gained a sweater, while the gift giver has lost a sweater. One might even see this as a kind of law of nature—for an exchange to occur, someone must gain something and someone must lose something. This "law" applies to all physical objects from soup to nuts. However, information is different.

If, for example, someone writes a manuscript for a book, that person possesses a new manuscript—information—that did not exist before. If the manuscript is sold to a publisher, that publisher possesses the manuscript and may print it as a book. Clearly, an exchange has taken place; the publisher owns the information as a manuscript and the writer has money from the publisher. However, even though the writer sold the manuscript, he or she still has the information. The information remains in the writer's computer or perhaps in a folder; he or she can still read it out loud and even give a copy of the text to a friend. Unlike the example of the sweater, the writer has not lost the information by exchanging it for money.

This paradox of the physical world applies to all kinds of information. Whether the information in question is a manuscript or a piece of software or a movie or a poem, when it is given away, it still remains with the creator of the information. For unlike physical objects, information can be easily copied, and nearly always, it is the copy that is exchanged. Indeed, the ongoing revolution in information technology is all about exponential increases in the ease and fidelity with which information can be copied and transmitted. A person can experience this phenomenon by creating and copying a software file for a friend. The friend can use the file, though the original remains with the creator of the file. Once the friend has the file, he or she can copy it and distribute it to others. In this way, the potential spread of the file is unlimited. The file will continue to disperse as long as someone will copy it and pass it on to another person. Thus, whereas the exchange of a physical good, such as a sweater, requires the giver to relinquish it so the receiver can have it, information can be duplicated and exchanged with ease so both giver and receiver can have it at the same time. It would seem that information is without limits, unlike physical objects in the material world.
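The file-copying scenario described above can be sketched in a few lines of Python. The file names and contents here are invented for illustration; the point is only that copying leaves the original with its creator, unlike handing over a sweater.

```python
# Sketch: duplicating information leaves the original intact.
import shutil
import tempfile
from pathlib import Path

workdir = Path(tempfile.mkdtemp())

# The creator writes a "manuscript" (file name and text are invented).
manuscript = workdir / "manuscript.txt"
manuscript.write_text("Call me Ishmael.")

# "Giving" the file to a friend is really copying it.
friends_copy = workdir / "friends_copy.txt"
shutil.copy(manuscript, friends_copy)

# Both parties now hold identical information; nothing was relinquished.
assert manuscript.read_text() == friends_copy.read_text()
assert manuscript.exists()  # the original is still with its creator
```

The friend's copy can in turn be copied onward, which is exactly the unlimited dispersal the paragraph describes.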

However, information can be treated as a physical object. The fact that a person can read from a book while physically holding it in his or her hands proves that information can be configured to the characteristics of a physical object. When information is recorded onto a physical medium, such as paper, tape, celluloid, plastic, or metal, the medium allows the information to be treated as a physical object. After all, a book can be shipped from New York to San Francisco in the same box with a sweater. Similarly, a compact disc (CD) can be transported thousands of miles away from the studio where it was cut. People make lists and carry them around in their pockets, in the same way that they carry keys, pens, and coins—all physical objects. People's daily lives are full of instances where they treat information as though it were a physical object. Nevertheless, regardless of how tangible the form, it is still information—an encyclopedia retains all of the information printed in it no matter how much one copies from it.

The packaging of information, which confers on it the characteristics of a physical object, also helps its transformation into an economic good. The facility with which information can be distributed, in part because of its remarkable ease of duplication, encourages entrepreneurs to explore the commercial possibilities. And, because information can be exchanged for profit, there can emerge markets for information of any kind for which there is a buyer and a seller. Increasingly, the production and distribution of information dominates the U.S. economy. The largest corporations generate information as their product and trade it around the world. Here, then, is the basis for the vast information economy that binds together the economic systems of the world.

Even so, the very ease of duplication that makes information so potentially profitable is also its Achilles' heel. Because information can be duplicated so easily, others can take the information being sold, copy it, and sell it themselves, whether it belongs to them or not. The more illegal copies circulate, the less incentive the legitimate producer has to offer the information for sale; a software company may even take a program off the market because illegal copying is costing it too much money. In this way, the selling of illegal copies threatens the legal sale of information, discourages legal sellers from offering their goods, and may keep valuable and interesting information off the market. When that happens, everyone suffers.

Because information is valued so highly, solutions have emerged to reconcile the vulnerability of information to the characteristics of the physical world. In the world of physical economic goods, producers, sellers, and buyers maintain order through contracts. The same can be applied to the intangible world of information. When the writer in the example above sells the manuscript to the publisher, he or she has agreed to a very important limitation. Even though the writer still possesses the information, the publisher controls the distribution of it. In effect, though the writer may still have the text of the manuscript stored in his or her computer, the writer cannot offer it for sale; whereas, the publisher may bring the information in the manuscript to market with the exclusive right to sell it. As a result, the ability to control the availability of goods, which is so critical to physical markets, can be artificially created for information.

The menace to the orderly functioning of information markets is so threatening that governments have also stepped in and legislated protections for the sale and purchase of information. These laws generally fall under the legal class of copyrights and patents. By registering a text such as a song, or a design such as an invention, with the proper government agency, an individual or organization receives an exclusive right to sell and distribute that information for a fixed period of years. Should anyone else attempt to distribute the same information without permission, then the government is obligated to protect the owners by prosecuting those people who are guilty of illegal use. Without legal protections, no commercial producer of information could expect to profit from his or her product because anyone could take it and reproduce it. Fortunately, legal protections against unlawful copying function well enough that information markets thrive.

If the easy duplication of information poses a threat to the orderly functioning of markets, that same attribute offers an advantage to another group of enterprising producers. These individuals write software and then make it available as shareware and freeware. These two unusual kinds of software take advantage of the remarkable ease with which information can be copied and distributed. The author of a shareware program says in effect, "Here is my product. If you like it, pay me for it. If you don't want to pay me, then keep it anyway." Surprisingly, software authors who introduce shareware can be quite successful; their product may take off, or they may be hired by a software firm that is impressed with their code-writing abilities. Clearly, the success of shareware depends both on distribution within the market and on distribution outside of the market, because it is both sold and given away. Yet, even in this strange hybrid of selling and giving, the heart of the strategy lies in the essential feature of information: the fact that the producer retains the information after distributing it to others.

Clearly, the way in which individuals think about information influences the way in which they act. People are so comfortable imagining all manner of possibilities for the uses of information that new information applications are invented with ease. However, no two people interpret information in exactly the same way, nor do they place the same value on it. If the first major feature of information is its ease of duplication, then its second major feature is its subjective value. Information conveys a different meaning to each person and, consequently, a different value to each person. Take, for example, a reading of the Iliad. One person might read it to learn about the culture of preclassical Greece. A second person might read it as an allegory of the role of the hero in Western literature. A third person might read it as a mighty adventure story. The text remains identical in every reading, but each reader takes an entirely different meaning from it. This occurs because the meaning of the information in the Iliad, as with any piece of information, rests not in the text or content but in the mind of the reader. Every person's brain contains a unique configuration of knowledge, and it is within that context that new information receives its special meaning.

Ultimately, the meaning and value of information is subjective. The progression whereby humans convert data into information and then frame it within their previous thoughts and experiences results in knowledge. Information is produced from data, and then knowledge is produced from information. Thus, knowledge is an attribute of an individual. When individuals seek to communicate knowledge, they have to transform it back into information that can be communicated. All knowledge, then, is derived from information and grounded in the ideas that humans have previously communicated to each other; even new ideas derive from the accumulation of previous ideas. Were it not so, humans would find it even more difficult to communicate with each other than is already the case because in that situation the basis for understanding would be harder to achieve. Without information, there could be no individual consciousness; and without communication, there would be no society.


Information possesses an amazing capacity for duplication; with information technology, people possess an ever-increasing capacity to duplicate and transmit information. Moreover, each individual derives a personal subjective meaning from any piece of information. However, the fundamental condition that characterizes the Information Age is the ease with which people think of information as a physical object. By describing, selling, storing, and transporting information as though it were a physical object, modern individuals achieve the tremendous accomplishment of an information economy. Economic innovations (e.g., new markets for information) and social perspectives that are derived from this attitude (e.g., judging a newspaper by the "amount" of information contained in it) have become so common that they are taken for granted. This idea of information—treating information as though it is a physical thing—stands as the base of the information society, because it directs thinking in such a way as to encourage the information economy, as well as the language with which individuals make sense of the information society.

See also: Computer Software; Copyright; Economics of Information; Ethics and Information; Home as Information Environment; Human Information Processing; Information Industry; Information Society, Description of; Language and Communication; Language Structure; Preservation and Conservation of Information; Reference Services and Information Access; Research Methods in Information Studies; Retrieval of Information; Standards and Information; Symbols; Use of Information; Visualization of Information.


Artandi, Susan. (1973). "Information Concepts and Their Utility." Journal of the American Society for Information Science 24(4):242-245.

Braman, Sandra. (1989). "Defining Information: An Approach for Policymakers." Telecommunications Policy 13(3):233-242.

Brown, John Seely, and Duguid, Paul. (2000). The Social Life of Information. Boston: Harvard Business School Press.

Buckland, Michael K. (1991). "Information as Thing." Journal of the American Society for Information Science 42(5):351-360.

Negroponte, Nicholas. (1995). Being Digital. New York: Vintage Books.

Ruben, Brent D. (1988). Communication and Human Behavior, 2nd edition. New York: Macmillan.

Schement, Jorge R. (1993). "An Etymological Exploration of the Links between Information and Communication." Information and Behavior 4:173-189.

Jorge Reina Schement


views updated May 18 2018


Science, technology, and ethics are all forms of information that depend on information to work. Furthermore there exist sciences, technologies, and ethics of information. To disentangle some of the main relations among these aspects of information, it is helpful to start with a simple example.

Monday morning. John turns the ignition key of his car, but nothing happens: The engine does not even cough. Not surprisingly, the low-battery indicator is flashing. After a few more unsuccessful attempts, John calls the garage and explains that, last night, his wife had forgotten to turn off the car's lights—this is a lie; John left them on himself but is too ashamed to admit it—and now the battery is dead. John is told that the car's operation manual explains how to use jumper cables to start the engine. Luckily his neighbor has everything John needs. He follows the instructions, starts the car, and drives to the office.

This everyday example illustrates the many ways in which people understand one of their most important resources: information. The information galaxy is vast, and this entry will explore only two main areas: information as content and information as communication. The reader interested in knowing more about the philosophical analysis of the concept should consult the work of Jaakko Hintikka and Patrick Suppes (1970), Philip P. Hanson (1990), and Fred I. Dretske (1999).

Information as Content

It is common to think of information as consisting of data (Floridi 2005). An intuitive way of grasping the notion of data is to imagine an answer without a question. Ultimately data may be described as relational differences: a 0 instead of a 1; a red light flashing; a high or low charge in a battery.

To become information, data need to be well-formed and meaningful. Well-formed means that data are clustered together correctly, according to the rules (syntax) of the chosen language or code. For example, the operation manual from the example above shows the batteries of two cars placed one next to, not one on top of, the other. Meaningful indicates that the data must also comply with the meanings (semantics) of the chosen language or code. So the operation manual contains illustrations that are immediately recognizable.
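The two conditions can be illustrated with a toy code; the binary "language" and the example strings below are invented for this sketch. A string of data may be well-formed under the code's syntax and, separately, meaningful under its semantics.

```python
# Sketch: well-formedness (syntax) versus meaningfulness (semantics)
# for a toy code of binary numerals.
import re

def well_formed(data: str) -> bool:
    """Syntax: a datum is well-formed if it is a string of binary digits."""
    return re.fullmatch(r"[01]+", data) is not None

def meaning(data: str) -> int:
    """Semantics: interpret a well-formed string as a base-2 numeral."""
    if not well_formed(data):
        raise ValueError("not well-formed under this code")
    return int(data, 2)

assert well_formed("1101")
assert meaning("1101") == 13     # well-formed and meaningful
assert not well_formed("12a")    # ill-formed: carries no meaning in this code
```

Only data that pass both tests count as information under the definition above; "12a" fails at the level of syntax before any question of meaning arises.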

When meaningful and well-formed data are used to talk about the world and describe it, the result is semantic content (Bar-Hillel and Carnap 1953, Bar-Hillel 1964). Semantic content has a twofold function. Like a pair of pincers, it picks up information from or about a situation, a fact, or a state of affairs f, and models or describes f. "The battery is dead" carves out and extracts this piece of information—that the battery of the car is dead—and uses it to model reality as a semantic world in which the battery is dead. Whether the work done by the specific pair of pincers is satisfactory depends on the resource f (realism) and on the purpose for which the pincers are being used (teleologism). Realistically, "the battery is dead" is true. Teleologically, it is successful given the goal of communicating to the garage the nature of the problem. "The battery is dead" would be realistically false and teleologically unsatisfactory if it were used, for instance, to provide an example of something being deceased.

INFORMATION AS TRUE SEMANTIC CONTENT. True semantic content is perhaps the most common sense in which information can be understood (Floridi 2005). It is also one of the most important ways, since information as true semantic content is a necessary condition for knowledge. Some elaboration of this concept is in order. First the data that constitute information allow or invite certain constructs and resist or impede others. Data in this respect work as constraining affordances. Second the data are never accessed and elaborated independently of a level of abstraction (LoA). An LoA is like an interface that establishes the scope and type of data that will be available as a resource for the generation of information (Floridi and Sanders 2004). "The battery is what provides electricity to the car" is a typical example of information elaborated at a driver's LoA. An engineer's LoA may output something like "a 12-volt lead-acid battery is made up of six cells, each cell producing approximately 2.1 volts," and an economist's LoA may suggest that "a good quality car battery will cost between $50 and $100 and, if properly maintained, it should last five years or more." Data as constraining affordances—answers waiting for the relevant questions—are transformed into information by being processed semantically at a given LoA (alternatively, the right question is associated with the right data at a given LoA).
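As a rough illustration of how the same body of data yields different information at different LoAs, consider the following sketch. The data values and the two interfaces are invented for the example, not drawn from any actual battery model.

```python
# Sketch: one set of data, two levels of abstraction (LoAs).
battery_data = {"cells": 6, "volts_per_cell": 2.1, "charge": 0.0}

def driver_loa(data: dict) -> str:
    # The driver's interface exposes only whether the car will start.
    return "the battery is dead" if data["charge"] == 0.0 else "the battery is fine"

def engineer_loa(data: dict) -> str:
    # The engineer's interface exposes cells and voltages.
    return (f"{data['cells']} cells at {data['volts_per_cell']} V each, "
            f"{data['cells'] * data['volts_per_cell']:.1f} V nominal")

print(driver_loa(battery_data))    # the battery is dead
print(engineer_loa(battery_data))  # 6 cells at 2.1 V each, 12.6 V nominal
```

Neither output is more "correct" than the other; each LoA turns the same underlying data into information fit for a different purpose, which is the point of the passage above.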

Once information is available, knowledge can be built in terms of justified or explained information, thus providing the basis of any further scientific investigation. One knows that the battery is dead not by merely guessing correctly, but because one sees the red light of the low-battery indicator flashing and perceives that the engine does not start. The fact that data count as resources for information, and hence for knowledge, rather than sources, provides a constructionist argument against any representationalist theory that interprets knowledge as a sort of picture of the world.

An instance of misinformation arises when some semantic content is false (untrue) (Fox 1983). If the source of the misinformation is aware that the semantic content is false, one may speak of disinformation, for example, "my wife left the lights on." Disinformation and misinformation are ethically censurable but may be successful teleologically: If one tells the mechanic that one's wife left the lights on last night, the mechanic will still be able to provide the right advice. Likewise, information may fail to be teleologically successful; just imagine telling the mechanic only that one's car is "out of order."

INSTRUCTIONAL INFORMATION. True semantic content is not the only type of information. The operation manual, for example, also provides instructional information, either imperatively—in the form of a recipe: First do this, then do that—or conditionally—in the form of some inferential procedure: If such and such is the case do this, otherwise do that. Instructional information is not about f and does not model f: It constitutes or instantiates f; that is, it is supposed to make f happen. The printed score of a musical composition or the digital files of a program are typical cases of instructional information. The latter clearly has a semantic side, and semantic and instructional information may be joined in performative contexts, such as christening a vessel—for example, "this ship is now called HMS The Informer"—or programming—for example, when declaring the type of a variable. Finally the two types of information may come together in magic spells, where semantic modeling is confused with instructional power and control. Yet, as a test, one should recall that instructional information does not qualify alethically (from aletheia, the Greek word for truth). In the example, it would be silly to ask whether "only use batteries with the same-rated voltage" is true or false.

ENVIRONMENTAL INFORMATION. When John turned the ignition key, the low-battery indicator flashed. He translated the flashing into (a) semantic information: The battery is dead; and (b) instructional information: The battery needs to be charged or replaced. However the flashing of the indicator is actually an example of environmental information.

Environmental information may be described as natural data: It requires two systems a and b to be coupled in such a way that a being (of type, or in state) F is correlated to b being (of type, or in state) G, thus carrying to the observer the information that b is G (Jon Barwise and Jerry Seligman provide a similar analysis based on Dretske 1999). The correlation is usually nomical (it follows some law). It may be engineered—as in the case of the low-battery indicator (a) whose flashing (F) is triggered by, and hence is informative about, the battery (b) being dead (G). Or it may be natural, as when litmus—a coloring matter from lichens—is used as an acid-alkali indicator (litmus turns red in acid solutions and blue in alkaline solutions). Other typical examples include the correlation between fingerprints and personal identification, or between the age of a plant and its growth rings.
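The engineered correlation described above can be sketched as follows. The voltage threshold and the state names are assumptions made for the illustration, not part of the original example: the indicator (system a) is coupled to the battery (system b) so that the indicator being in state F (flashing) carries the information that the battery is in state G (low).

```python
# Sketch: an engineered correlation between two coupled systems.
LOW_VOLTAGE_THRESHOLD = 11.8  # assumed cutoff, in volts (invented)

def indicator_state(battery_volts: float) -> str:
    """Couple the indicator (a) to the battery (b): flashing iff low."""
    return "flashing" if battery_volts < LOW_VOLTAGE_THRESHOLD else "off"

# Because of the coupling, an observer can read the battery's state off
# the indicator without ever measuring the battery directly.
assert indicator_state(12.6) == "off"       # healthy battery
assert indicator_state(10.2) == "flashing"  # low battery
```

The function itself attaches no meaning to "flashing"; the semantic step happens only when an observer, like John, interprets the correlated state.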

One may be so used to equating the low-battery indicator flashing with the information (that is, meaning) that the battery is dead as to find it hard to distinguish sufficiently between environmental and semantic information. However, it is important to remember that environmental information may require or involve no semantics at all. It may consist of correlated data understood as mere differences or constraining affordances. Plants (e.g., a sunflower), animals (e.g., an amoeba), and mechanisms (e.g., a photocell) are certainly capable of making practical use of environmental information even in the absence of any (semantic processing of) meaningful data. Figure 1 summarizes the main distinctions introduced so far.

FIVE TYPES OF INFORMATION. More detail may now be added. First, it should be emphasized that the actual format, medium, and language in which information is encoded are often irrelevant. The same semantic, instructional, and environmental information may be analog or digital, printed on paper or viewed on a screen, in English or some other language. Second, thus far it has been implicitly assumed that primary information is the central issue: things like the low-battery indicator flashing, or the words "the battery is dead" spoken over the phone. But remember how John discovered that the battery was dead. The engine failed to make any of the usual noises. Likewise, in Sir Arthur Conan Doyle's Silver Blaze (1892), Sherlock Holmes solves the case by noting something that has escaped everybody else's attention: the unusual silence of the dog. Clearly silence may be very informative. This is a peculiarity of information: Its absence may also be informative. When it is, the difference may be explained by speaking of secondary information.

Apart from secondary information, three other typologies are worth some explanation since they are quite common (the terminology is still far from being standard or fixed, but see Floridi 1999b). Metainformation is information about the nature of information. "'The battery is dead' is encoded in English" is a simple example. Operational information is information about the dynamics of information. Suppose the car has a yellow light that, when flashing, indicates that the system checking the car's electronic components is malfunctioning. The fact that the light is off indicates that the low-battery indicator is working properly, thus confirming that the battery is indeed dead. Finally, derivative information is information that can be extracted from any form of information whenever the latter is used as a source in search of patterns, clues, or inferential evidence, namely for comparative and quantitative analyses. From a credit card bill concerning the purchase of gasoline, one may derive information about the cardholder's whereabouts at a given time.

Information as Communication

Also important is the concept of information as communication, as in the sense of a transmitted message (Cherry 1978). Some features of information are intuitively quantitative. Information can be encoded, stored, and transmitted. One also expects it to be additive (information a + information b = information a + b) and non-negative. Similar properties of information are investigated by the mathematical theory of communication (MTC, also known as information theory; for an accessible introduction, see Jones 1979).

MTC was developed by Claude E. Shannon (Shannon and Weaver 1998 [1949]) with the primary aim of devising efficient ways of encoding and transferring data. Its two fundamental problems are the ultimate level of data compression (how small can a message be, given the same amount of information to be encoded?) and the ultimate rate of data transmission (how fast can data be transmitted over a channel?). To understand this approach, consider the telephone call to the garage.

The telephone communication with the mechanic is a specific case of a general communication model. The model is described in Figure 2.

John is the informer, the mechanic is the informee, "the battery is dead" is the message (the informant); there is a coding and decoding procedure through a language (English), a channel of communication (the telephone system), and some possible noise. Informer and informee share the same background knowledge about the collection of usable symbols (the alphabet).

MTC treats information as only a selection of symbols from a set of possible symbols, so a simple way of grasping how MTC quantifies raw information is by considering the number of yes/no questions required to guess what the informer is communicating. When a fair coin is tossed, one question is sufficient to guess whether the outcome is heads (h) or tails (t). Therefore a binary source, like a coin, is said to produce one bit of information. A two-fair-coins system produces four ordered outputs: ⟨h, h⟩, ⟨h, t⟩, ⟨t, h⟩, ⟨t, t⟩, and therefore requires two questions, each output containing two bits of information, and so on. In the example, the low-battery indicator is also a binary device: If it works properly, it either flashes or it does not, exactly like a tossed coin. And since it is unlikely to flash, when it does, the red light is very informative. More generally, the lower the probability of p, the more informative the occurrence of p is (unfortunately this leads to the paradoxical view that a contradiction—which has probability 0—is the most informative of all contents, unless one maintains that, to qualify as information, p needs to be true [Floridi 2004]).

Before the coin is tossed, the informee does not know which symbol the device will actually produce, so it is in a state of data deficit equal to 1 (Shannon's uncertainty). Once the coin has been tossed, the system produces an amount of raw information that is a function of the possible outputs, in this case two equiprobable symbols, and equal to the data deficit that it removes. The reasoning applies equally well to the letters used in John's telephone conversation with the mechanic.

The analysis can be generalized. Call the number of possible symbols N. For N = 1, the amount of information produced by a unary device is 0. For N = 2, by producing an equiprobable symbol, the device delivers one unit of information. And for N = 4, by producing an equiprobable symbol, the device delivers the sum of the amount of information provided by coin A plus the amount of information provided by coin B, that is, two units of information. Given an alphabet of N equiprobable symbols, it is possible to rephrase some examples more precisely by using the following equation: log₂(N) = bits of information per symbol.
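The equation just given can be checked with a few lines of code. The following is an illustrative Python sketch, not part of the original entry; the function name is chosen for clarity:

```python
import math

def bits_per_symbol(n):
    """Information per equiprobable symbol from an alphabet of size n:
    the number of yes/no questions needed to single out one symbol."""
    return math.log2(n)

# A unary device (N = 1) carries 0 bits; a coin (N = 2) carries 1 bit
# per toss; two coins (N = 4) carry 2 bits, and so on.
for n in (1, 2, 4, 8):
    print(n, bits_per_symbol(n))
```

Note that for an alphabet that is not a power of two (e.g., 26 letters), the result is fractional: log₂(26) ≈ 4.7 bits per letter.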

Things are made more complicated by the fact that real coins are always biased, and so are low-battery indicators. Likewise, in John's conversation with the mechanic, a word like "batter" will make "y" as the next letter almost certain. To calculate how much information a biased device produces, one must rely on the frequency of the occurrences of symbols in a finite series of occurrences, or on their probabilities, if the occurrences are supposed to go on indefinitely. Once probabilities are taken into account, the previous equation becomes Shannon's formula (where H = uncertainty, what has been called above data deficit):

H = −Σᵢ Pᵢ log₂(Pᵢ) bits of information per symbol
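Shannon's formula, H = −Σᵢ Pᵢ log₂(Pᵢ), can be evaluated numerically. The following is an illustrative Python sketch (not part of the original entry), showing that a biased device produces less information per symbol than a fair one:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) over a probability distribution, in bits.
    Zero-probability symbols contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: exactly 1 bit per toss
print(shannon_entropy([0.9, 0.1]))   # biased coin: less than 1 bit
print(shannon_entropy([1.0]))        # certain outcome: 0 bits
```

The certain-outcome case matches the unary device above: a source that can produce only one symbol removes no data deficit at all.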

The quantitative approach just outlined plays a fundamental role in coding theory, hence in cryptography, and in data storage and transmission techniques, which are based on the same principles and concepts. Two of them are so important as to deserve a brief explanation: redundancy and noise.

Redundancy refers to the difference between the physical representation of a message and the mathematical representation of the same message that uses no more bits than necessary. It is basically what can be taken away from a message without loss in communication. John's statement that his wife was responsible for the dead battery was redundant.

Compression procedures work by reducing data redundancy, but redundancy is not always a bad thing, for it can help to counteract equivocation (data sent but never received) and noise (received but unwanted data, like some interference). A message + noise contains more data than the original message by itself, but the aim of a communication process is fidelity, the accurate transfer of the original message from sender to receiver, not data increase. The informee is more likely to reconstruct a message correctly at the end of the transmission if some degree of redundancy counterbalances the inevitable noise and equivocation introduced by the physical process of communication and the environment. This is why, over the phone, John said both that the battery is dead and that the lights were left on last night. It was specifying by whom the lights had been left on that was uselessly redundant.

MTC is not a theory of information in the ordinary sense of the word. The term raw information has been used to stress the fact that in MTC information has an entirely technical meaning. Two equiprobable yeses contain the same quantity of raw information, regardless of whether their corresponding questions are "Is the battery dead?" or "Is your wife missing?" Likewise, if one knows that a device could send with equal probabilities either this whole encyclopedia or just a quote for its price, by receiving one or the other message one would receive very different quantities of data (bytes) but only one bit of raw information. Since MTC is a theory of information without meaning, and since information − meaning = data, "mathematical theory of data communication" is a far more appropriate description than "information theory."

MTC deals not with semantic information itself but with messages constituted by uninterpreted symbols encoded in well-formed strings of signals, so it is commonly described as a study of information at the syntactic level. This generates some confusion because one may think the syntactic versus semantic dichotomy to be exhaustive. Clearly MTC can be applied in information and communication technologies (ICT) successfully because computers are syntactical devices. It is often through MTC that information becomes a central concept and topic of research in disciplines like chemistry, biology, physics, cognitive science, neuroscience, the philosophy of information (Floridi 2002, Floridi 2004a), and computer ethics (Floridi 1999a).


SEE ALSO Computer Ethics; Cybernetics; Digital Libraries; Geographic Information Systems; Information Overload; Information Society; Internet; Wiener, Norbert.


Bar-Hillel, Yehoshua. (1964). Language and Information: Selected Essays on Their Theory and Application. Reading, MA; London: Addison-Wesley. Important collection of relevant essays by one of the philosophers who first tried to apply information theory to semantic information.

Bar-Hillel, Yehoshua, and Rudolf Carnap. (1953). "An Outline of a Theory of Semantic Information." In Language and Information: Selected Essays on Their Theory and Application. Reading, MA; London: Addison-Wesley. Influential attempt to quantify semantic information.

Barwise, Jon, and Jerry Seligman. (1997). Information Flow: The Logic of Distributed Systems. Cambridge, UK: Cambridge University Press. Applies situation logic to the study of the dynamics of information.

Cherry, Colin. (1978). On Human Communication: A Review, a Survey, and a Criticism, 3rd edition. Cambridge, MA: MIT Press.

Dretske, Fred I. (1999). Knowledge and the Flow of Information. Stanford, CA: CSLI Publications. Originally published in 1981, Cambridge, MA: MIT Press. A simple and informative introduction to the mathematical theory of communication, even if no longer up-to-date.

Floridi, Luciano. (1999a). "Information Ethics: On the Theoretical Foundations of Computer Ethics." Ethics and Information Technology 1(1): 37–56. Provides a first foundation for an approach in computer ethics known as information ethics.

Floridi, Luciano. (1999b). Philosophy and Computing: An Introduction. London, New York: Routledge. Textbook introduction for philosophy students to computer science and its conceptual challenges.

Floridi, Luciano. (2002). "What Is the Philosophy of Information?" Metaphilosophy 33(1–2): 123–145. Provides a definition of the philosophy of information as an area of research.

Floridi, Luciano. (2004). "Outline of a Theory of Strongly Semantic Information." Minds and Machines 14(2): 197–222. Defends a theory of semantic information based on a veridical interpretation of information, in order to solve the Bar-Hillel-Carnap paradox.

Floridi, Luciano, and J. W. Sanders. (2004). "The Method of Abstraction" In Yearbook of the Artificial: Nature, Culture and Technology: Models in Contemporary Sciences, ed. Massimo Negrotti. Bern: Peter Lang. Develops the method of abstraction for philosophical analysis.

Floridi, Luciano. (2005). "Is Information Meaningful Data?" Philosophy and Phenomenological Research 70(2): 351–370. Defends the thesis that information encapsulates truth, hence that "false" in "false information" is to be understood as meaning "not authentic."

Fox, Christopher J. (1983). Information and Misinformation: An Investigation of the Notions of Information, Misinformation, Informing, and Misinforming. Westport, CT: Greenwood Press. Overview of the concepts mentioned in the title from an information science perspective.

Hanson, Philip P., ed. (1990). Information, Language, and Cognition. Vancouver: University of British Columbia Press. Proceedings of an influential conference on semantic information.

Hintikka, Jaakko, and Patrick Suppes, eds. (1970). Information and Inference. Dordrecht, The Netherlands: Reidel. Influential collection of essays on the philosophy of semantic information.

Jones, Douglas Samuel. (1979). Elementary Information Theory. Oxford: Clarendon Press. Excellent introduction to the mathematical theory of communication for the mathematically uninitiated.

Shannon, Claude E., and Warren Weaver. (1998 [1949]). The Mathematical Theory of Communication. Urbana: University of Illinois Press. Classic presentation of the fundamental results in the mathematical theory of communication.




The word information is used in three principal senses: (1) the mathematical sense from which arises the theory of digital communication or information theory; (2) the linguistic sense in which it is synonymous with the dissemination of meanings understood by members of a culture; and (3) the formative sense in which information denotes the process of giving shape to some medium or substance.

Kinds of information

Counting-information is mathematical information as defined by American mathematician and engineer Claude Shannon (1916–2001) in a paper on communication theory written in 1948. It has nothing directly to do with meaning; rather it relates solely to an arbitrary measure based upon the theory of probability.

Meaning-information is information in the colloquial sense of knowledge. It is completely different from Shannon's concept of information; it is interpretation-, language-, and culture-dependent.

Shaping-information denotes information as a noun describing the action of giving form to something. It is the oldest sense of the word, originating in the Latin verb informare, further reflected in current usage in the German informieren and the French informer. In this sense, one can speak of the "information" of a system when one imposes constraints upon its degrees of freedom, for example by giving content and structure to a spreadsheet.

Construed in these three ways, information crosses boundaries between physics, culture, and mind. In its modern, counting-information sense, especially in the realm of information technology, it seems to have taken on a life of its own, as if the process of rendering things digitally had some intrinsic value apart from its use in conveying meaning and enabling people to shape the world. As with any new technology (the telephone, the television, the motor car, the mobile phone), there is a period during which fascination with the technology itself supplants the wisdom that governs its use, but eventually the more important purposes resume their ascendancy, and the technology once again comes to be seen as no more than a tool.

The religious significance of the science of information is best understood in terms of the articulation of meaning and the establishment of a balanced view of the place of information in human life. That process is in full swing as digitization, the Internet, global communication, and the dissolution of historical boundaries reshape how people conceive of themselves and how they decide to live their lives.

If technology is to serve rather than dictate human needs, it is essential that people retain their capacity to think creatively, which is to generate the ideas that give shape to the technology by investing it with significant meanings. Otherwise human needs will increasingly be at the mercy of the agendas of those individuals, corporations, and nation-states that control the technology, and people will be powerless to resist their influence by giving expression to their own objectives. Articulation of worthy religious goals is one contribution that theology can make to the restoration of the balance between creative thought and technological power.


The mathematical concept of counting-information is based upon binary arithmetic, on the ability to distinguish between two states, typically represented as 0 and 1, in an electronic device. One such distinguishable state is called a binary digit, or bit. Combinations of these states allow data to be encoded in strings, such as 01110101010, that can be stored in two-state devices and transmitted down communication channels. Electronic circuits that can distinguish between only two states are relatively easy to devise, although higher-state devices are possible. The process of encoding facts about the world in such binary strings is called digitization, although any particular encoding is arbitrary.

A string of n bits can exist in 2^n different states and so can represent 2^n different symbols. For example, when n = 3, the string can be 000, 001, 010, 011, 100, 101, 110, or 111. If a particular encoding treats these strings as binary numbers, they represent 0, 1, 2, . . . , 7; another encoding might treat them as a, b, . . . , h. In the early years of computing it was thought that 256 different strings would be sufficient to encode most common letters, numbers, and control codes. The number of bits required to store a given amount of data is therefore usually measured in eight-bit units called bytes because of the number of different states of a single byte (2^8 = 256). Numbers of bits are counted in powers of 2, so a kilobyte is 2^10 = 1024 bytes; a megabyte is 1024 kilobytes (1024K); and a gigabyte is 1024 megabytes. Typical hard disks can now store between 20 and 100 gigabytes.
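The counting described above can be verified directly. The following brief Python sketch is illustrative only and not part of the original entry:

```python
from itertools import product

# Enumerate every state of a 3-bit string: there are 2**3 = 8 of them.
states = ["".join(bits) for bits in product("01", repeat=3)]
print(states)       # '000' through '111'
print(len(states))  # 8

# The powers of two behind the usual storage units.
print(2**8)   # distinct states of one byte: 256
print(2**10)  # bytes in a kilobyte: 1024
```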

The states of a binary system are typically called 0 and 1, True and False, or Yes and No. The system itself is oblivious to these interpretations of the two possible states of a bit, and it is helpful to distinguish between system states and interpretations of those states, for example using the terminology of counting-, meaning- and shaping-information.

The physics of information

The physics of information has given rise to some remarkable results. Shannon showed that there are limits to the rate at which information can be transmitted down a channel with a particular capacity if it is to retain its integrity. Leo Szilard and Leon Brillouin demonstrated that there are fundamental limits to the rate at which information can be processed at given temperatures. Jacob Bekenstein showed that the amount of information that an object can contain (the Bekenstein bound) is directly related to its mass. Some, such as Carl Friedrich von Weizsäcker, have attempted to reconstruct all of physics in information-theoretic terms by conceiving of all physical processes as streams of information. Still others have employed information to look for a fundamental link between information and thermodynamic entropy.

The ability to transfer information digitally requires data to be encoded in a binary form; the limitations of such transmission are the subject of information theory as first elaborated by Shannon. However, information is not always easily converted to digital form, especially when it arises from continuous analogue processes, when strict conversion into a discrete coded form is not possible. Neither are the processes that arise from and are useful to human beings easily distilled into the pure digital states required by computers. Some of the most difficult problems faced by those who work in information technology concern the accommodation of computer systems to the untidiness of the data and processes that are typical of human life.

The question of the fundamental nature of information is philosophically and physically deep. It is irrelevant whether one counts to base 2 (as in binary systems) or some other base, but the question of what one is measuring cannot be avoided, and touches some of the hardest questions in physics.

The state of a bit cannot be detected without degrading energy and so increasing the net entropy of the universe. This familiar phrase encapsulates the physical truth that one cannot obtain something for nothing. The Scottish physicist James Clerk Maxwell (1831–1879) once proposed a thought experiment in which a demon capable of detecting the movement of molecules of gas could open and close a trapdoor to allow fast molecules through and keep slow molecules out, thus increasing the temperature of one side of the partition and infringing the second law of thermodynamics. It is now generally accepted that the flaw in this argument arises from the need to increase the entropy of the universe in order to ascertain the state of the molecule; in other words, reading a certain number of bits of information has a thermodynamic cost.

Encoding and encryption

Although encryption is important in the social and political realms affected by information technology, the fundamentals are mathematical and fall within the realm of information theory. The details of modern encryption involve difficult mathematics, but the essential process is not hard to understand. In a simple code or cipher one typically expects to move from an everyday symbol such as 1 or a to a binary string such as 000, to store and manipulate that string in a computer, and then to decode the result by reversing the encoding process. Unfortunately, anyone familiar with the encoding can decode the results, and there are times when one does not wish one's messages to be read: perhaps because they contain private commercial information, perhaps because they contain the plans of criminals or terrorists, perhaps because they contain state secrets. So people would like a way to transmit messages in code. But, if the recipient is to decode them, it seems that the decoding rules must also be transmitted, and they could themselves be intercepted, thus compromising the integrity of the message. What is more, it is far harder to know whether an electronic communication has been intercepted than a physical communication such as a book or letter. Instead people need a way to transmit code that does not require the recipient to be told what the encoding process involves. Fortunately, a way to do this has been devised. It is now embodied in the RSA procedure and is as strong or as weak as the number of bits employed in the encryption. This procedure works as follows. Two very large prime numbers that are intrinsically difficult to guess or find (the private key) are used with another number to generate a pair of numbers (the public key) that everyone knows. This process is essentially irreversible in that there is no tractable way to regenerate the original two prime numbers from the public key.
The public key is then used by anyone who wishes to send me an encoded message to encrypt it, and I, when I receive it, can decode it using my private key. Anyone intercepting the encrypted message, even if in possession of the public key, cannot decrypt it, because they cannot recover the private key necessary to do so. The strength of the system lies in the size of the public key: a 40-bit number is deemed very difficult to crack; a 128-bit number is deemed almost impossible with current hardware; a 256-bit number could not be decrypted within the lifetime of the universe.
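The procedure can be illustrated with a toy calculation. The primes below come from the standard textbook illustration of RSA and are absurdly small; this Python sketch is an illustration of the arithmetic only, not a usable implementation:

```python
# Toy RSA (insecure: real keys use primes hundreds of digits long).
p, q = 61, 53                 # the private ingredients
n = p * q                     # 3233, part of the public key
phi = (p - 1) * (q - 1)       # 3120
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent (2753), Python 3.8+

message = 65                  # any integer smaller than n
cipher = pow(message, e, n)   # encrypt with the public key (n, e)
plain = pow(cipher, d, n)     # decrypt with the private exponent d
print(cipher, plain)          # plain recovers the original 65
```

Recovering d from (n, e) alone would require factoring n into p and q, which for realistically sized keys is intractable; that asymmetry is the whole point of the scheme.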

See also Information Technology; Information Theory


Leff, Harvey S., and Andrew F. Rex. Maxwell's Demon: Entropy, Information, Computing. Bristol, UK: Adam Hilger, 1990.

Puddefoot, John C. "Information and Creation." In The Science and Theology of Information: Proceedings of the Third European Conference on Science and Theology, ed. Christoff Wassermann, Richard Kirby, and Bernard Rordorff. Geneva, Switzerland: Labor et Fides, 1992.

Puddefoot, John C. "Information Theory, Biology, and Christology." In Religion and Science: History, Method, Dialogue, eds. W. Mark Richardson and Wesley J. Wildman. New York and London: Routledge, 1996.

Rényi, Alfréd. A Diary on Information Theory. New York: Wiley, 1984.

Shannon, Claude E. "A Mathematical Theory of Communication." The Bell System Technical Journal 27 (1948): 379–423, 623–656.

Singh, Simon. The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography. London: Fourth Estate, 1999.

John C. Puddefoot




An information is a formal written accusation against a person for a criminal offense presented under oath by a public officer, usually a prosecutor. An information is used to charge an individual with criminal activity in cases where an indictment by a grand jury is unnecessary or is waived by the accused. Like an indictment, the filing of an information results in the commencement of a formal prosecution. Thus, the information must be clear and specific in order to give adequate notice to the accused of the charges against him and permit him to prepare his defense.

Most states permit prosecution by information or indictment at the option of the prosecutor. In these states, it is rare for a prosecutor not to use an information because it is easier and less time-consuming than an indictment. Grand jury indictments will be used in these jurisdictions only when the prosecutor wants to use the investigative powers of the grand jury. In other states, indictments are required in all felony cases or in all capital cases. However, even in these states, informations are used in misdemeanor cases and in felony cases where the accused has waived his right to a grand jury indictment.

In federal misdemeanor cases, prosecutors have the option under the federal rules of criminal procedure to proceed by indictment or information. In federal felony cases, accused individuals have the right to insist on prosecution by indictment, but this right can be waived in all but capital cases.

Most jurisdictions limit the prosecutor's discretion to file an information. Generally, the prosecutor cannot file an information unless the accused has had a preliminary hearing before a magistrate. This requirement is designed to weed out groundless charges, thereby relieving an accused of the burden of preparing a defense. However, the effectiveness of this limitation on prosecutorial abuse in filing informations is undercut in several ways. First, in most jurisdictions, a finding of no probable cause by one magistrate at a preliminary hearing does not preclude presenting the case to another magistrate. Thus, a prosecutor can "shop around" for a magistrate who will find the requisite probable cause and enable the prosecutor to file an information.

In addition, in filing an information, the prosecutor is not always bound by the findings of the magistrate at the preliminary hearing. Some states permit the prosecutor to charge the accused in the information only with the crimes for which the magistrate decided there was probable cause. In other states, the information can charge the offense for which the accused was bound over at the preliminary hearing and any other offenses supported by the evidence at the preliminary hearing.

Another problem with using the preliminary hearing as a check on the prosecutor's decision to file an information is that the prosecutor often dominates the magistrate's hearing. Furthermore, in Gerstein v. Pugh (1975), the Supreme Court implied that the federal Constitution does not require a preliminary judicial hearing to determine whether there is probable cause for the prosecutor to file an information.

Charles H. Whitebread





in·for·ma·tion /ˌinfərˈmāshən/ • n. 1. facts provided or learned about something or someone: a vital piece of information. ∎ Law a formal criminal charge lodged with a court or magistrate by a prosecutor without the aid of a grand jury: the tenant may lay an information against his landlord. 2. what is conveyed or represented by a particular arrangement or sequence of things: genetically transmitted information. ∎ Comput. data as processed, stored, or transmitted by a computer. ∎ (in information theory) a mathematical quantity expressing the probability of occurrence of a particular sequence of symbols, impulses, etc., as contrasted with that of alternative sequences. DERIVATIVES: in·for·ma·tion·al /-shənl/ adj. in·for·ma·tion·al·ly /-shənl-ē/ adv.



information Generally, information is whatever is capable of causing a human mind to change its opinion about the current state of the real world. Formally, and especially in science and engineering, information is whatever contributes to a reduction in the uncertainty of the state of a system; in this case, uncertainty is usually expressed in an objectively measurable form. Commonly, this is done by means of Shannon's entropy. Nevertheless, this formula for uncertainty involves probabilities, and these may well have to be subjective. If that is so, the formal measurement must be qualified as depending on subjective probabilities, and “uncertainty” must be replaced by “opinion, or personal estimate, of uncertainty”.
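The reduction-of-uncertainty view can be made concrete with a small numerical sketch (Python, illustrative only, not part of the original entry). If eight states of a system are equally likely, the uncertainty is log₂ 8 = 3 bits; if received data rule out all but two states, 1 bit of uncertainty remains, so the information conveyed is the 2-bit reduction:

```python
import math

def uncertainty(n_equiprobable_states):
    """Shannon entropy of a uniform distribution over n states, in bits."""
    return math.log2(n_equiprobable_states)

before = uncertainty(8)  # eight possible states: 3 bits of uncertainty
after = uncertainty(2)   # the data leave two candidates: 1 bit remains
print(before - after)    # information conveyed: 2.0 bits
```

With non-uniform (possibly subjective) probabilities, the same calculation uses the full entropy formula rather than a simple logarithm, as the entry notes.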

Information must be distinguished from any medium that is capable of carrying it. A physical medium (such as a magnetic disk) may carry a logical medium (data, such as binary or text symbols). The information content of any physical objects, or logical data, cannot be measured or discussed until it is known what range of possibilities existed before and after they were received. The information lies in the reduction in uncertainty resulting from the receipt of the objects or the data, and not in the size or complexity of the objects or data themselves. Questions of the form, function, and semantic import of data are only relevant to information inasmuch as they contribute to the reduction of uncertainty. If an identical memorandum is received twice, it does not convey twice the information that its first occurrence conveyed: the second occurrence conveys no information at all, unless, by prior agreement, the number of occurrences is itself to be regarded as significant.

Information has ramifications in security, politics, culture, and the economy, as well as in science and engineering. The extent to which information is used as an economic commodity is one of the defining characteristics of the “post-industrial” society, hence the phrase “the information society”.




The formal accusation of a criminal offense made by a public official; the sworn, written accusation of a crime.

An information is tantamount to an indictment in that it is a sworn written statement which charges that a particular individual has done some criminal act or is guilty of some criminal omission. The distinguishing characteristic between an information and an indictment is that an indictment is presented by a grand jury, whereas an information is presented by a duly authorized public official.

The purpose of an information is to inform the accused of the charge against him, so that the accused will have an opportunity to prepare a defense.
