5.9 Optimal code lengths that require one bit above entropy. The source coding theorem shows that the optimal code for a random variable X has an expected length less than H(X) + 1. Give an example of a random variable for which the expected length of the optimal code is close to H(X) + 1 [i.e., for any ε > 0, construct a distribution for which the optimal code has L > H(X) + 1 − ε].
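A minimal sketch of the standard construction for this exercise: a Bernoulli(δ) source with δ → 0. Any code for a two-symbol alphabet needs at least one bit per symbol, so the optimal expected length is L = 1, while H(X) → 0 as δ → 0, giving L − H(X) → 1. The variable names below are illustrative, not from the page.

```python
# Sketch: Bernoulli(delta) source. The optimal (Huffman) code for a
# two-symbol alphabet assigns one bit to each symbol, so L = 1 exactly,
# while the entropy H(delta) -> 0 as delta -> 0.

from math import log2

def binary_entropy(delta: float) -> float:
    """Entropy in bits of a Bernoulli(delta) random variable."""
    if delta in (0.0, 1.0):
        return 0.0
    return -delta * log2(delta) - (1 - delta) * log2(1 - delta)

L = 1.0  # expected length of the optimal code for a binary alphabet

for delta in (0.1, 0.01, 0.001, 1e-6):
    H = binary_entropy(delta)
    print(f"delta={delta:<8g} H(X)={H:.6f}  L - H(X) = {L - H:.6f}")

# As delta shrinks, L - H(X) approaches 1, so for any eps > 0 one can
# pick delta small enough that H(delta) < eps, giving L > H(X) + 1 - eps.
```

So for any ε > 0, choosing δ small enough that H(δ) < ε yields L = 1 > H(X) + 1 − ε, which is the bound the problem asks for.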
