# Question

Suppose a random variable X has N = 2^n equally likely outcomes. What is the entropy of X in bits?
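Since all N = 2^n outcomes are equally likely, each has probability 1/2^n and the entropy is H(X) = log2(2^n) = n bits. A minimal numerical check (the choice n = 5 is illustrative):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), with 0*log(0) := 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 5
N = 2 ** n                    # N = 2^n equally likely outcomes
uniform = [1 / N] * N
print(entropy_bits(uniform))  # → 5.0, i.e. exactly n bits
```

The uniform distribution maximizes entropy over a fixed number of outcomes, which is why the answer takes the simple closed form log2(N).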

## Related Questions

- Suppose a fair coin is flipped n times and the random variable Y counts the number of times heads occurs. What is the entropy of Y in bits? Compare your answer to that of Exercise 4.85 and explain any difference.
- A communication system sends data in the form of packets of fixed length. Noise in the communication channel may cause a packet to be received incorrectly. If this happens, the packet is retransmitted. Let the ...
- Find the first three moments of a geometric random variable whose PMF is P_N(n) = (1 − p)p^n, n = 0, 1, 2, ….
- Prove that all odd central moments of a Gaussian random variable are equal to zero. Furthermore, develop an expression for all even central moments of a Gaussian random variable.
- Show that the concept of total probability can be extended to expected values. That is, if {A_i}, i = 1, 2, 3, …, n, is a set of mutually exclusive and exhaustive events, then …
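The coin-flip question above can be explored numerically. Y ~ Binomial(n, 1/2) is concentrated near n/2 rather than uniform, so its entropy falls below the n bits of the main exercise; a sketch (n = 5 is an arbitrary example value):

```python
import math

def binomial_entropy_bits(n):
    """Entropy in bits of Y ~ Binomial(n, 1/2), where p_k = C(n, k) / 2^n."""
    probs = [math.comb(n, k) / 2 ** n for k in range(n + 1)]
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 5
print(binomial_entropy_bits(n))  # strictly less than n for n > 1
```

The gap arises because Y takes only n + 1 values and non-uniformly, so H(Y) ≤ log2(n + 1) < n, whereas recording the full flip sequence yields n bits.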
