# Question

Consider an N-letter source with probabilities p_i, i = 1, 2, …, N. The entropy of the source is given by

H = - Σ_{i=1}^{N} p_i log2(p_i)

Prove that the discrete distribution that maximizes the entropy is the uniform distribution. Hint: You need to perform a constrained optimization, since the source probabilities must form a valid probability mass function, and thus p_1 + p_2 + … + p_N = 1.
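Before working through the Lagrange-multiplier argument the hint calls for, a quick numeric sanity check can build confidence in the claim. The sketch below (our own helper names, using NumPy's Dirichlet sampler to draw random valid PMFs) compares the entropy of the uniform PMF against many randomly drawn PMFs of the same length:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a PMF p (zero-probability terms contribute 0)."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return float(-np.sum(nz * np.log2(nz)))

N = 4
uniform = np.full(N, 1.0 / N)
H_uniform = entropy(uniform)  # equals log2(N) = 2 bits for N = 4

# Draw random valid PMFs (non-negative, summing to 1) and check that
# none of them exceeds the entropy of the uniform distribution.
rng = np.random.default_rng(0)
for _ in range(1000):
    q = rng.dirichlet(np.ones(N))
    assert entropy(q) <= H_uniform + 1e-9
```

This is evidence, not proof: the formal argument maximizes H subject to the constraint Σ p_i = 1, e.g. via the Lagrangian J = H + λ(Σ p_i - 1), whose stationarity conditions force every p_i to take the same value 1/N.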

