Sunday, October 25, 2020

Noiseless coding theorem

How short can the encoding of an information source be made without loss? Consider a source S that emits M symbols:

S = (a1, a2, …, aM)


Each ai is a source symbol, and pi is its probability:

pi = P(ai)


Encoding assigns a codeword to each symbol:

K(ai) = ci


li is the length of the codeword for ai:


li = |K(ai)|
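The definitions above can be sketched with a toy source. The symbols, probabilities, and codewords below are illustrative assumptions, not values from the text:

```python
# Toy source with M = 3 symbols and an example binary prefix code.
# The probabilities and codewords are assumed for illustration.
source = {
    "a1": {"p": 0.50, "code": "0"},   # p_i = P(a_i), c_i = K(a_i)
    "a2": {"p": 0.25, "code": "10"},
    "a3": {"p": 0.25, "code": "11"},
}

# l_i = |K(a_i)| is the codeword length in bits.
lengths = {s: len(v["code"]) for s, v in source.items()}

# Average code length: sum over i of p_i * l_i.
L = sum(v["p"] * len(v["code"]) for v in source.values())
print(lengths)  # {'a1': 1, 'a2': 2, 'a3': 2}
print(L)        # 1.5
```

Because no codeword is a prefix of another, a concatenated message can be decoded unambiguously.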

I(E) is the self-information of an event E with probability P(E): I(E) = -log2 P(E). The entropy H(S) is its expected value over the source symbols.
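These quantities can be computed directly from the standard definitions I(E) = -log2 P(E) and H(S) = sum of pi * I(ai). The probabilities below are an assumption for illustration:

```python
import math

# Self-information of an event E with probability P(E): I(E) = -log2 P(E).
def self_information(p):
    return -math.log2(p)

# Entropy H(S): the expected self-information over the source symbols.
probs = [0.50, 0.25, 0.25]  # assumed p_i, for illustration
H = sum(p * self_information(p) for p in probs)
print(self_information(0.25))  # 2.0  (a rarer event carries more information)
print(H)                       # 1.5
```

For these probabilities, a prefix code with lengths (1, 2, 2) reaches an average of 1.5 bits per symbol, equal to H(S), so it is optimal.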



Example:
Consider the first-order extension of the source, i.e., encoding one symbol at a time.
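For the first extension (one source symbol per codeword), the noiseless coding theorem says an optimal code satisfies H(S) <= L < H(S) + 1, where L is the average code length. A sketch using Shannon code lengths, with assumed probabilities:

```python
import math

# Assumed source probabilities, for illustration only.
probs = [0.6, 0.3, 0.1]

# Entropy H(S) in bits.
H = -sum(p * math.log2(p) for p in probs)

# Shannon code lengths l_i = ceil(-log2 p_i) satisfy the Kraft inequality,
# so a prefix code with these lengths exists, and its average length L
# obeys H(S) <= L < H(S) + 1.
lengths = [math.ceil(-math.log2(p)) for p in probs]
L = sum(p * l for p, l in zip(probs, lengths))

assert sum(2.0 ** -l for l in lengths) <= 1.0  # Kraft inequality
assert H <= L < H + 1
print(lengths)  # [1, 2, 4]
```

Encoding the n-th extension (blocks of n symbols) tightens the per-symbol bound to H(S) <= L/n < H(S) + 1/n, which is the content of the theorem.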
