[CST-2] Information Theory

Timothy Hospedales tmh31@cam.ac.uk
Sat, 25 May 2002 23:14:37 +0100


Hi,
 I've confused myself while thinking about Kolmogorov complexity and
entropy. Suppose you have a string, s = 31415..., the digits of Pi.
K(s) is very small - roughly the length of a short program that
computes Pi. But given that the distribution of digits in Pi is
supposed to be very random, if you assess H(s) = -Sum(p*log(p)) over
the digit frequencies, the entropy should be very high, and you get
H(s) >> K(s). That seems to contradict H ~= K, which is supposed to
be the case.
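
 For concreteness, this is the sort of calculation I mean - a rough
Python sketch, with a hard-coded prefix of Pi just standing in for s:

    # Rough sketch: empirical per-digit entropy H = -sum(p * log2(p)),
    # where p is the frequency of each digit in the string.
    from collections import Counter
    from math import log2

    s = "31415926535897932384626433832795028841971693993751"

    counts = Counter(s)
    n = len(s)
    H = -sum((c / n) * log2(c / n) for c in counts.values())

    # The digits of Pi look roughly uniform, so H comes out near
    # log2(10) ~ 3.32 bits per digit, even though a short program
    # can generate arbitrarily many of them.
    print(H)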

 Anyone willing to clear this up for me?

Thanks a lot,
Tim