Introduction to Coding and Information Theory | Steven Roman
If I tell you something you already know (e.g., "The sun will rise tomorrow"), I have transmitted very little information. If I tell you something shocking (e.g., "The sun did not rise today"), I have transmitted a massive amount of information.

Mathematically, the information content \( h(x) \) of an event \( x \) with probability \( p \) is:

\[ h(x) = -\log_2(p) \]
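A quick sketch of this formula in Python (the probabilities below are illustrative, not taken from the text):

```python
import math

def self_information(p: float) -> float:
    """Information content, in bits, of an event with probability p."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# A near-certain event carries almost no information;
# a surprising one carries a lot.
print(self_information(0.999))   # ~0.0014 bits ("the sun rose")
print(self_information(0.0001))  # ~13.3 bits  ("the sun did not rise")
```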
Entropy is the average amount of information produced by a source. It is also the minimum number of bits required, on average, to encode the source without losing any information:
\[ H = -\sum_{i=1}^{n} p_i \log_2(p_i) \]
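Both readings of entropy, the average surprise and the compression floor, are easy to compute; a minimal sketch, assuming a source given as a list of symbol probabilities (the example distributions are mine, not the text's):

```python
import math
from collections import Counter

def entropy(probs) -> float:
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def entropy_of_bytes(data: bytes) -> float:
    """Empirical entropy of a byte string, treating each byte as a symbol."""
    counts = Counter(data)
    n = len(data)
    return entropy(c / n for c in counts.values())

print(entropy([0.5, 0.5]))            # 1.0 bit: a fair coin
print(entropy([0.9, 0.1]))            # ~0.47 bits: a biased coin is more predictable
print(entropy_of_bytes(b"aaaaaaab"))  # low: this source is easy to compress
```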
In the Hamming(7,4) code, if you receive a 7-bit string, you run the three parity checks. The result (called the syndrome) is a 3-bit binary number: 000 means the word arrived clean, while any value from 001 to 111 tells you exactly which bit to flip to fix the message.
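A minimal sketch of that decoding step, assuming the classic Hamming(7,4) layout with parity bits at positions 1, 2, and 4 (1-indexed), which is the arrangement that makes the syndrome read out the error position directly; the received word below is an illustrative example:

```python
def hamming74_syndrome(bits):
    """Syndrome of a 7-bit word, assuming the classic Hamming(7,4)
    layout with parity bits at positions 1, 2, and 4 (1-indexed)."""
    s = 0
    for k in (1, 2, 4):
        # Parity check k XORs every position whose index has bit k set.
        parity = 0
        for pos in range(1, 8):
            if pos & k:
                parity ^= bits[pos - 1]
        if parity:
            s |= k
    return s  # 0 means no error; 1..7 is the position of the flipped bit

def correct(bits):
    """Flip the bit the syndrome points at, if any."""
    word = list(bits)
    s = hamming74_syndrome(word)
    if s:
        word[s - 1] ^= 1
    return word, s

# The codeword for data bits 1011 in this layout is 0110011;
# here, bit 5 has been flipped in transit.
received = [0, 1, 1, 0, 1, 1, 1]
fixed, syndrome = correct(received)
print(f"syndrome = {syndrome:03b}")  # 101: flip bit 5
print("corrected word =", fixed)     # [0, 1, 1, 0, 0, 1, 1]
```

Putting the parity bits at the power-of-two positions is what makes this work: each parity check covers exactly the positions whose binary index has that bit set, so the three check results spell out the error position in binary.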
In Shannon’s world, when your data corrupts, you are witnessing Hamming distance at work: noise has pushed the received word away from the codeword that was sent. When your compression algorithm bloats instead of shrinks, you are witnessing high entropy: the input is already close to random, and no lossless code can squeeze it smaller on average.
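The compression half of that claim is easy to see first-hand with any general-purpose compressor; a hypothetical demo using Python's zlib:

```python
import os
import zlib

low_entropy = b"ab" * 500        # highly predictable: compresses well
high_entropy = os.urandom(1000)  # near-uniform random: incompressible

for name, data in [("low entropy", low_entropy), ("high entropy", high_entropy)]:
    out = zlib.compress(data)
    print(f"{name}: {len(data)} -> {len(out)} bytes")

# The random input typically comes out slightly LARGER than it went in:
# no lossless code can beat the source's entropy on average.
```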
