Question 1.28: In Section 1.6, we introduced the idea of entropy h(x) as the information gained on observing the value of a random variable x having distribution p(x). We saw that, for independent variables x and y for which p(x, y) = p(x)p(y), the entropy functions are additive, so that h(x, y) = h(x) + h(y). In this exercise, we derive the relation between h and p in the form of a function h(p). First show that h(p^2) = 2h(p), and hence by induction that h(p^n) = n h(p), where n is a positive integer. Hence show that h(p^(n/m)) = (n/m) h(p), where m is also a positive integer. This implies that h(p^x) = x h(p) where x is a positive rational number, and hence, by continuity, where x is a positive real number. Finally, show that this implies h(p) must take the form h(p) ∝ ln p.
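As a quick numerical sanity check (not part of the derivation itself), the candidate solution h(p) = -ln p can be verified to satisfy both the additivity property for independent events and the scaling property h(p^x) = x h(p). The function name `h` and the sample probabilities below are illustrative choices; the proportionality constant is taken to be -1 so that h is non-negative for 0 < p ≤ 1.

```python
import math

def h(p):
    # Candidate entropy function h(p) = -ln(p), the form the exercise derives
    # (with the proportionality constant chosen as -1, an illustrative choice).
    return -math.log(p)

p, q = 0.3, 0.7  # arbitrary example probabilities

# Additivity for independent events: p(x, y) = p(x)p(y)
# implies h(p * q) = h(p) + h(q).
assert math.isclose(h(p * q), h(p) + h(q))

# Scaling property h(p**x) = x * h(p), checked for integer,
# rational, and irrational exponents.
for x in [2, 3, 5 / 7, math.sqrt(2)]:
    assert math.isclose(h(p ** x), x * h(p))

print("all checks passed")
```

The loop mirrors the structure of the exercise: the integer exponents correspond to the induction step, the rational exponent to the n/m step, and the irrational one to the continuity argument.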