Information theory, a mathematical representation of the conditions and parameters affecting the transmission and processing of information. Most closely associated with the work of the American electrical engineer Claude Shannon in the mid-20th century, information theory is chiefly of interest to communication engineers, though some of the concepts have been adopted and used in such fields as psychology and linguistics. Information theory overlaps heavily with communication theory, but it is more oriented toward the fundamental limitations on the processing and communication of information and less oriented toward the detailed operation of particular devices.

Historical background

Interest in the concept of information grew directly from the creation of the telegraph and telephone. In 1844 the American inventor Samuel F.B. Morse built a telegraph line between Washington, D.C., and Baltimore, Maryland. Morse encountered many electrical problems when he sent signals through buried transmission lines, but inexplicably he encountered fewer problems when the lines were suspended on poles. These transmission problems attracted the attention of many distinguished physicists, most notably the Scotsman William Thomson (Baron Kelvin). In a similar manner, the invention of the telephone in 1875 by Alexander Graham Bell and its subsequent proliferation drew further scientific notables, such as Henri Poincaré, Oliver Heaviside, and Michael Pupin, to the problems of transmitting signals over wires. Much of their work was done using Fourier analysis, a technique described later in this article, but in all of these cases the analysis was dedicated to solving the practical engineering problems of communication systems.

The formal study of information theory did not begin until 1924, when Harry Nyquist, a researcher at Bell Laboratories, published a paper entitled “Certain Factors Affecting Telegraph Speed.” Nyquist realized that communication channels had maximum data transmission rates, and he derived a formula for calculating these rates in noiseless channels of finite bandwidth. Another pioneer was Nyquist’s colleague R.V.L. Hartley, whose paper “Transmission of Information” (1928) established the first mathematical foundations for information theory.
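
In modern notation, Nyquist’s result states that a noiseless channel of bandwidth B hertz can carry at most 2B pulses per second, and combining this with Hartley’s logarithmic measure of information gives a maximum transmission rate of

R = 2B log₂ M bits per second,

where M is the number of distinguishable signal levels that each pulse may take. This compact form is a modern restatement; neither Nyquist nor Hartley expressed the result in bits.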

The real birth of modern information theory can be traced to the publication in 1948 of Claude Shannon’s “A Mathematical Theory of Communication” in the Bell System Technical Journal. A key step in Shannon’s work was his realization that, in order to have a theory, communication signals must be treated in isolation from the meaning of the messages that they transmit. This view is in sharp contrast with the common conception of information, in which meaning has an essential role. Shannon also realized that the amount of knowledge conveyed by a signal is not directly related to the size of the message. A famous illustration of this distinction is the correspondence between French novelist Victor Hugo and his publisher following the publication of Les Misérables in 1862. Hugo sent his publisher a card with just the symbol “?”. In return he received a card with just the symbol “!”. Within the context of Hugo’s relations with his publisher and the public, these short messages were loaded with meaning; lacking such a context, these messages are meaningless. Similarly, a long, complete message in perfect French would convey little useful knowledge to someone who could understand only English.

Shannon thus wisely realized that a useful theory of information would first have to concentrate on the problems associated with sending and receiving messages, and it would have to leave questions involving any intrinsic meaning of a message—known as the semantic problem—for later investigators. Clearly, if the technical problem could not be solved—that is, if a message could not be transmitted correctly—then the semantic problem was not likely ever to be solved satisfactorily. Solving the technical problem was therefore the first step in developing a reliable communication system.

It is no accident that Shannon worked for Bell Laboratories. The practical stimuli for his work were the problems faced in creating a reliable telephone system. A key question that had to be answered in the early days of telecommunication was how best to use the physical plant—in particular, how to transmit the maximum number of telephone conversations over existing cables. Prior to Shannon’s work, the factors for achieving maximum utilization were not clearly understood. Shannon’s work defined communication channels and showed how to assign a capacity to them, not only in the theoretical sense where no interference, or noise, was present but also in practical cases where real channels were subjected to real noise. Shannon produced a formula that showed how the bandwidth of a channel (the range of signal frequencies it can carry) and its signal-to-noise ratio (a measure of interference) affected its capacity to carry signals. In doing so he was able to suggest strategies for maximizing the capacity of a given channel and showed the limits of what was possible with a given technology. This was of great utility to engineers, who could focus thereafter on individual cases and understand the specific trade-offs involved.
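
Expressed in modern notation, this result, now generally known as the Shannon-Hartley theorem, is written

C = B log₂(1 + S/N),

where C is the channel capacity in bits per second, B is the bandwidth of the channel in hertz, and S/N is the ratio of signal power to noise power. The symbols used here are the conventional modern ones rather than Shannon’s original notation.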

Shannon also made the startling discovery that, even in the presence of noise, it is always possible to transmit information at rates arbitrarily close to the theoretical channel capacity with an arbitrarily small probability of error. This discovery inspired engineers to look for practical techniques to improve performance in signal transmissions that were far from optimal. Shannon’s work clearly distinguished gains that could be realized by adopting a different encoding scheme from gains that could be realized only by altering the communication system itself. Before Shannon, engineers lacked a systematic way of analyzing and solving such problems.
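
Stated in modern terms, Shannon’s noisy-channel coding theorem says that for any transmission rate R below the channel capacity C and any tolerance ε > 0, there exist coding schemes that operate at rate R with a probability of decoding error smaller than ε, whereas at rates above C reliable transmission is impossible. (The symbols R, C, and ε are the conventional modern notation, not Shannon’s own.)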

Shannon’s pioneering work thus presented many key ideas that have guided engineers and scientists ever since. Though information theory does not always make clear exactly how to achieve specific results, people now know which questions are worth asking and can focus on areas that will yield the highest return. They also know which sorts of questions are difficult to answer and the areas in which there is not likely to be a large return for the amount of effort expended.

Since the 1940s and ’50s the principles of classical information theory have been applied to many fields. The section Applications of information theory surveys achievements not only in such areas of telecommunications as data compression and error correction but also in the separate disciplines of physiology, linguistics, and physics. Indeed, even in Shannon’s day many books and articles appeared that discussed the relationship between information theory and areas such as art and business. Unfortunately, many of these purported relationships were of dubious worth. Efforts to link information theory to every problem and every area were disturbing enough to Shannon himself that in a 1956 editorial titled “The Bandwagon” he issued the following warning:

I personally believe that many of the concepts of information theory will prove useful in these other fields—and, indeed, some results are already quite promising—but the establishing of such applications is not a trivial matter of translating words to a new domain, but rather the slow tedious process of hypothesis and experimental verification.

The era we are living in is sometimes called the age of information. But what is information, and how much of it is in any message? Consider two situations. Suppose you had planned to play tennis with a friend at a nearby park, but a heavy rain prevents you from leaving the house. The telephone rings, and your friend tells you the game is off because it is raining. This message holds no information, because you already know that it is raining. Now suppose instead that you had planned to play at a park near your friend’s house several miles away, that the sky was merely overcast, and that the forecast the night before had put the chance of morning rain as high. When your friend calls to say the game is off because it is raining there, the message does contain information, because before the call you were not certain whether it was raining at the park. Information, therefore, is anything that resolves uncertainty.
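
Shannon’s theory makes this intuition quantitative: a message reporting an event of probability p carries

I = log₂(1/p) bits

of information, so news of something already certain (p = 1) carries log₂ 1 = 0 bits, while news of a less likely event carries correspondingly more. (This is the standard modern measure of self-information; the weather example above is only an informal illustration of it.)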