Question: Independence bound on Entropy

Independence bound on Entropy
...................................
A consequence of the Chain Rule for Entropy is that if we have many
different random variables $X_1, X_2, \ldots, X_n$, then the sum of all their
individual entropies must be an upper bound on their joint entropy:

$$H(X_1, X_2, \ldots, X_n) \le \sum_{i=1}^{n} H(X_i).$$
Their joint entropy achieves this upper bound if and only if all of
these $n$ random variables are independent.
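As a quick numerical check of the bound, here is a minimal sketch using NumPy; the `entropy` helper and the example joint distribution are illustrative assumptions, not part of the original text.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector p (with 0 log 0 := 0)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution over two dependent binary variables
# X1 (rows) and X2 (columns), chosen purely for illustration.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

h_joint = entropy(joint)                                          # H(X1, X2) ~= 1.72 bits
h_sum = entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0))   # H(X1) + H(X2) = 2 bits
print(h_joint, "<=", h_sum)

# Replacing the joint with the product of its marginals makes X1 and X2
# independent, and the joint entropy then meets the bound exactly.
indep = np.outer(joint.sum(axis=1), joint.sum(axis=0))
print(entropy(indep))                                             # 2.0 bits = H(X1) + H(X2)
```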
Another upper bound to note is that for any single random variable $X$
which has $N$ possible values, its entropy $H(X)$ is maximised when all
of those values have the same probability $p_i = 1/N$. In that case,

$$H(X) = -\sum_i p_i \log_2 p_i = -\sum_{i=1}^{N} \frac{1}{N} \log_2 \frac{1}{N} = \log_2 N.$$
We shall use this fact when evaluating the efficiency of coding schemes.
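The same helper can illustrate this second bound numerically; again a minimal sketch, with the value of $N$ and the skewed example distribution chosen arbitrarily for illustration.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector p (with 0 log 0 := 0)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

N = 8
print(entropy(np.full(N, 1.0 / N)))   # uniform: log2(8) = 3.0 bits, the maximum

# Any non-uniform distribution over the same N outcomes scores strictly lower.
skewed = np.array([0.5, 0.2, 0.1, 0.05, 0.05, 0.05, 0.03, 0.02])
print(entropy(skewed))                # ~2.21 bits < 3.0
```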
