
# Information Theory Overview

| | Compression (efficiency, source coding) | Error Correction (reliability, channel coding) |
|---|---|---|
| Information Theory (math) | Lossless: source coding theorem, Kraft-McMillan inequality. Lossy: rate distortion theorem | |
| Coding Methods (algorithms) | Symbol codes: Huffman codes. Stream codes: arithmetic coding | Hamming codes, BCH codes, Turbo codes, Gallager codes |

## Information

How many bits are needed to encode an event:

$$\log_{2}\left(\frac{1}{p}\right)$$

Where $p$ is the probability of the event. For example, the number of bits needed to encode the result of a fair coin flip ($p=\frac{1}{2}$) is 1.
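The formula above translates directly into code. A minimal sketch (the function name `information_content` is my own choice, not from the source):

```python
from math import log2

def information_content(p):
    """Bits needed to encode an event of probability p (the surprisal)."""
    return log2(1 / p)

print(information_content(1 / 2))  # fair coin flip -> 1.0
print(information_content(1 / 8))  # one of 8 equally likely outcomes -> 3.0
```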

## Entropy

On average, how many bits per symbol are needed to send a piece of information? The entropy is the expected information content:

$$H(X) = \sum_{i} p_{i} \log_{2}\left(\frac{1}{p_{i}}\right) = E(I(X))$$
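As a sketch, the entropy formula for a discrete distribution given as a list of probabilities (terms with $p_i = 0$ are skipped, since they contribute nothing):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) in bits: the expected information content."""
    return sum(p * log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin -> 1.0 bit
print(entropy([1 / 3, 1 / 2, 1 / 12, 1 / 12]))  # ~1.63 bits
```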

## Binary Tree Encoding (Huffman)

| Symbol | $p_{i}$ | Encoded |
|---|---|---|
| "A" | $\frac{1}{3}$ | 11 |
| "B" | $\frac{1}{2}$ | 0 |
| "C" | $\frac{1}{12}$ | 100 |
| "D" | $\frac{1}{12}$ | 101 |

Tags: informationTheory