[CST-2] Information theory & neural nets
Crispin
chvc2@hermes.cam.ac.uk
Mon, 3 Jun 2002 02:10:30 +0100 (BST)
> 2) Information thry, lecture notes, page 42.
> He deduces that h(Y|X) = h(N) "since N is independent of X"... I don't
> understand. Anyone point me in the right direction?
In a noiseless channel, h(Y|X) = 0: once we know X there is no remaining
uncertainty about Y. (In the sum for h, each term log 1/p(y|x) vanishes
because p(y|x) is 1.)
For an additive-noise channel Y = X + N, conditioning on X leaves the noise
as the only source of uncertainty, so h(Y|X) = h(X + N | X) = h(N | X) =
h(N), i.e. the entropy of the noise.
If the noise weren't independent of X, then h(Y|X) would depend on the
distribution of X as well, and wouldn't reduce to h(N).
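You can check this numerically. The sketch below (my own toy example, not
from the lecture notes) builds a discrete additive-noise channel
Y = (X + N) mod 4 with N independent of X, computes h(Y|X) directly from the
joint distribution, and confirms it equals h(N) whatever input distribution
p(X) you pick:

```python
import math
from itertools import product

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {value: prob}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Toy discrete additive-noise channel: Y = (X + N) mod 4, N independent of X.
pX = {0: 0.5, 1: 0.25, 2: 0.125, 3: 0.125}  # arbitrary input distribution
pN = {0: 0.9, 1: 0.1}                        # noise distribution

# Joint distribution p(x, y): since N is independent of X,
# p(x, y) = p(x) * p(n) with y = (x + n) mod 4.
pXY = {}
for (x, px), (n, pn) in product(pX.items(), pN.items()):
    y = (x + n) % 4
    pXY[(x, y)] = pXY.get((x, y), 0.0) + px * pn

# Conditional entropy h(Y|X) = sum_x p(x) * h(Y | X = x).
hYgX = 0.0
for x, px in pX.items():
    cond = {y: pxy / px for (xx, y), pxy in pXY.items() if xx == x}
    hYgX += px * entropy(cond)

print(f"h(N)   = {entropy(pN):.6f} bits")
print(f"h(Y|X) = {hYgX:.6f} bits")  # matches h(N), since N is independent of X
```

If you make pN depend on x (say, noisier for larger inputs) the two numbers
come apart, which is the point of the independence condition.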
I hope. IANApersonwithmuchofaclue.
--
Crispin
This email account will die soon - please use crispin@cantab.net instead!