[CST-2] Information Theory
Anthony Jones
amj30@cam.ac.uk
Sun, 26 May 2002 00:09:48 +0100
On Sat, May 25, 2002 at 11:14:37PM +0100, Timothy Hospedales wrote:
> I've confused myself while thinking about Kolmogorov complexity and
> entropy. Suppose you have a string, s = 31415..., which is the digits
> of Pi. K(s) is very small - the length of a program that calculates
> Pi. But given that the distribution of digits in Pi is supposed to be
> very random, if you assess H(s) = -Sum(p*log(p)) over each digit, the
> entropy should be very high, and you get H(s) >> K(s). This is
> different from H ~= K, which is supposed to be the case.
The digits aren't random at all, they just look random. They are
completely deterministic: for each position, P(digit = some number) is
either 1 or 0, so every p * log(p) term is zero and the sequence has
zero entropy. That matches (up to a constant) the length of the short
program that calculates them, so H ~= K still holds.

The large number you computed comes from plugging the *empirical* digit
frequencies into the entropy formula. That measures how uniform the
frequencies are, not how uncertain the source is - and entropy is a
property of a random source (a distribution), not of a fixed string.
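To see the two notions pull apart, here's a quick Python sketch (my own
illustration, not from any course code): the empirical digit-frequency
"entropy" of the first 50 digits of Pi is close to the log2(10) ~ 3.32
bits/digit maximum, even though the string is fully deterministic and
its true source entropy is zero.

```python
from collections import Counter
from math import log2

def empirical_entropy(s: str) -> float:
    """Shannon entropy (bits/symbol) of the empirical symbol
    frequencies of the fixed string s."""
    n = len(s)
    counts = Counter(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# First 50 decimal digits of Pi - a fixed, deterministic string.
pi_digits = "31415926535897932384626433832795028841971693993751"

# Prints roughly 3.18 bits/digit: near the log2(10) ~ 3.32 maximum,
# because the digit frequencies are close to uniform. The *source*
# entropy of a deterministic sequence is still zero.
print(round(empirical_entropy(pi_digits), 2))
```

The point is that empirical_entropy measures the shape of the frequency
table, which says nothing about whether a short program generates the
string.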
Ant