Noiseless coding theorem

How can an information source be compressed? Consider a source S with M symbols.

S = (a1, a2, ..., aM)


Each ai is a source symbol, and pi is its probability.

pi=P(ai)


Each symbol is encoded into a codeword ci.

K(ai)=ci


li is the length of the codeword ci.


li=|K(ai)|
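As a small sketch of these definitions (the symbols, probabilities, and code below are illustrative, not taken from the text), the average codeword length L = Σ pi li of a code can be computed directly:

```python
# Hypothetical source and binary code (illustrative values only).
probs = {"a1": 0.5, "a2": 0.25, "a3": 0.25}   # pi = P(ai)
code  = {"a1": "0", "a2": "10", "a3": "11"}   # ci = K(ai)

# li = |K(ai)| is the codeword length; L is the expected length.
lengths = {s: len(c) for s, c in code.items()}
L = sum(probs[s] * lengths[s] for s in probs)
print(L)  # 1.5
```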

The self-information of an event E with probability P(E) is I(E) = -log2 P(E), and the entropy H(S) is its expectation over the source:

H(S) = -Σ pi log2 pi

The noiseless coding theorem states that the average codeword length L = Σ pi li of any uniquely decodable binary code satisfies H(S) ≤ L, and a code with L < H(S) + 1 exists.
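A minimal sketch of the entropy computation (the probability values are illustrative assumptions, not from the text):

```python
from math import log2

# Hypothetical source probabilities (illustrative values only).
probs = [0.5, 0.25, 0.125, 0.125]

# H(S) = -sum(pi * log2(pi)): the expected self-information.
H = -sum(p * log2(p) for p in probs)
print(H)  # 1.75
```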



Ex. Encode the source one symbol at a time (the first-order case).
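As a worked sketch of the first-order case (the source and code below are illustrative assumptions): when every probability is a power of 1/2, choosing li = -log2 pi gives an average length that meets the entropy bound exactly.

```python
from math import log2

# Hypothetical dyadic source: pi = 2**(-li)  (illustrative values only).
probs   = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]   # li = |K(ai)| for the code 0, 10, 110, 111

H = -sum(p * log2(p) for p in probs)             # entropy H(S)
L = sum(p * l for p, l in zip(probs, lengths))   # average codeword length

# Noiseless coding theorem bound: H(S) <= L < H(S) + 1
print(H, L)  # 1.75 1.75
assert H <= L < H + 1
```

Here L = H(S) = 1.75 bits per symbol, so the code is optimal for this source.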
