Question: [24] For a computable probability distribution, the expected value of the Kolmogorov complexity equals the Shannon entropy up to the prefix Kolmogorov complexity of the probability distribution plus a constant
(Theorem 8.1.1 on page 626). Show a similar relationship between entropy and information distance, and between entropy and the normalized version of information distance.
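For reference, the content of Theorem 8.1.1 can be written out as follows (a paraphrase in the notation of the main text, with H(p) = −Σ_x p(x) log p(x) the Shannon entropy and K(p) the prefix complexity of the distribution p):

    0 ≤ Σ_x p(x) K(x) − H(p) ≤ K(p) + O(1).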
Comments. Source: [B. Hu, L. Bi, and S. Dai, Entropy, 19:6(2017), 260, doi:10.3390/e19060260]. Compare with the more difficult Theorem 8.1.3 on page 629 and Exercise 8.1.5 on page 647. Hint: the (normalized) information distance is a distance between objects x and y. The source reference derives bounds on the expectation of the (normalized) information distance; both bounds contain conditional entropies, and the upper bound contains an additional term of 2K(p), where p(x, y) is the computable joint distribution of the pairs x, y figuring in the (normalized) information distance. The formulas may be mathematically sound, but the lower and upper bounds on the expected value of the information distance differ by at least 2K(p). In practical cases (genomes, languages, music, and so on) the magnitude of K(p) is forbiddingly large, since usually K(p(x, y)) > K(x), K(y) and similarly K(p) > max_x K(x). The information distance between x and y is about max(K(x|y), K(y|x)), and its expectation vanishes with respect to K(p). Therefore, not only do the expected values tell little or nothing about the individual values, but also the difference between the upper and lower bounds is too great to be of any use in practical situations. On the basis of these results, entropy-based similarity methods and information-distance-based ones are radically different.
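A plausible way to formalize the bounds described in the hint, writing E(x, y) = max(K(x|y), K(y|x)) for the information distance and letting (X, Y) be distributed according to p, is the following (the choice of max(H(X|Y), H(Y|X)) as the conditional-entropy term is an assumption suggested by the form of E, not a quotation from the source paper):

    max(H(X|Y), H(Y|X)) ≤ Σ_{x,y} p(x, y) E(x, y) ≤ max(H(X|Y), H(Y|X)) + 2K(p) + O(1).

The gap between these bounds is 2K(p) + O(1), precisely the additive term criticized in the comments above; the exercise asks for bounds of this shape and for the analogous statement for the normalized information distance.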
