Also see Claude Shannon.
Entropy has distinct meanings in physics and in communications theory. In the most general sense, entropy refers to a process in which order deteriorates with the passage of time. The term also carries several special meanings in data communications.
Many scientists believe that the universe is naturally evolving toward a state of maximum entropy. The example commonly cited is the tendency for thermal differences to disappear. According to this view, sometimes called the heat death of the universe, everything will ultimately attain a uniform temperature: there will be no energy sources or absorbers, stars and galaxies as we know them will cease to exist, and there will be no life.
In data communications, the term entropy refers to the relative degree of randomness in a signal or data stream. The higher the entropy of the noise on a channel, the more frequent the signaling errors. Because a higher-entropy signal carries more information per symbol, entropy is closely tied to the maximum attainable data speed in bps (bits per second); it likewise increases with noise and bandwidth. The entropy of data is inversely related to its compressibility: the greater the entropy, the smaller the factor by which the data can be compressed. Entropy also refers to disorder deliberately added to data in certain encryption processes.
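To make the link between entropy and compressibility concrete, the sketch below (Python, with an illustrative shannon_entropy helper; the data sizes and labels are assumptions, not part of the definition above) estimates the Shannon entropy H = -Σ p(i) log2 p(i) of two byte strings, one highly ordered and one pseudorandom, and compares how far zlib can compress each.

```python
import math
import os
import zlib
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Estimate Shannon entropy in bits per byte: H = -sum(p_i * log2(p_i))."""
    if not data:
        return 0.0
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Low-entropy data: a single repeated byte value.
ordered = b"A" * 10_000
# High-entropy data: pseudorandom bytes standing in for a noisy source.
random_bytes = os.urandom(10_000)

for label, data in [("ordered", ordered), ("random", random_bytes)]:
    h = shannon_entropy(data)
    compressed = zlib.compress(data, 9)
    ratio = len(data) / len(compressed)
    print(f"{label:8s} entropy = {h:5.2f} bits/byte  compression ratio = {ratio:6.1f}x")
```

Run as written, the ordered string shows an entropy near 0 bits per byte and compresses by a factor of several hundred, while the pseudorandom string shows close to 8 bits per byte and barely compresses at all, illustrating that greater entropy means a smaller attainable compression factor.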