Information theory is a branch of mathematics that overlaps with communications engineering, biology, medicine, sociology, and psychology. The theory is devoted to discovering and exploring the mathematical laws that govern the behavior of data as it is transferred, stored, or retrieved.
In 1948, Claude Shannon, a mathematician at Bell Labs, published a paper entitled "A Mathematical Theory of Communication." The paper caught the immediate attention of mathematicians and scientists worldwide, and several disciplines grew out of it, including information theory, coding theory, and the entropy theory of abstract dynamical systems.
Whenever data is transmitted, stored, or retrieved, there are a number of variables such as bandwidth, noise, data transfer rate, storage capacity, number of channels, propagation delay, signal-to-noise ratio, accuracy (or error rate), intelligibility, and reliability. In audio systems, additional variables include fidelity and dynamic range. In video systems, image resolution, contrast, color depth, color realism, distortion, and the number of frames per second are significant variables. One of the most important applications of information theory is to determine the optimum system design for a given practical scenario.
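One classical result that ties several of these variables together is the Shannon-Hartley theorem, which gives the maximum error-free data rate of a channel as a function of its bandwidth and signal-to-noise ratio. The sketch below is illustrative (the function name and example figures are chosen for demonstration, not taken from the text):

```python
import math

def channel_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley channel capacity in bits per second.

    bandwidth_hz: channel bandwidth in hertz
    snr_linear:   signal-to-noise ratio as a linear power ratio
                  (an SNR of 30 dB corresponds to a ratio of 1000)
    """
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative example: a 3 kHz voice channel with a 30 dB SNR
# has a theoretical capacity of roughly 29.9 kb/s.
capacity = channel_capacity(3000, 1000)
```

Raising either the bandwidth or the signal-to-noise ratio raises the achievable rate, which is why these variables dominate practical system design.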
Information theory is an evolving discipline and continues to generate interest among experimentalists and theorists.