Independence bound on Entropy
A consequence of the Chain Rule for Entropy is that if we have many
different random variables $X_1, X_2, \ldots, X_n$, then the sum of all their
individual entropies must be an upper bound on their joint entropy:
$$H(X_1, X_2, \ldots, X_n) \le \sum_{i=1}^{n} H(X_i).$$
Their joint entropy achieves this upper bound with equality if and only if all of these $n$
random variables are independent.
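To see the bound concretely, here is a minimal Python sketch (not from the original notes; the joint distribution and the `entropy` helper are illustrative assumptions) comparing the joint entropy of two correlated binary variables with the sum of their marginal entropies:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array (zero entries ignored)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Illustrative joint distribution of two correlated binary variables X1, X2.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

p_x1 = joint.sum(axis=1)  # marginal distribution of X1
p_x2 = joint.sum(axis=0)  # marginal distribution of X2

print(entropy(joint))                 # H(X1, X2) ~= 1.72 bits
print(entropy(p_x1) + entropy(p_x2))  # H(X1) + H(X2) = 2.00 bits, the upper bound

# The bound is met with equality when the variables are independent:
independent = np.outer(p_x1, p_x2)
print(entropy(independent))           # 2.00 bits
```

Because X1 and X2 are correlated in this example, knowing one reduces the uncertainty about the other, so the joint entropy falls strictly below the sum of the marginals.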
Another upper bound to note is that, for any single random variable $X$
which has $N$ possible values, its entropy $H(X)$ is maximised when all
of those values have the same probability $p_i = 1/N$. In that case,
$$H(X) = -\sum_i p_i \log_2 p_i = \sum_{i=1}^{N} \frac{1}{N} \log_2 N = \log_2 N.$$
We shall use this fact when evaluating the efficiency of coding schemes.
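As a similar numerical check (again an illustrative sketch, not part of the original text, with made-up distributions), the uniform distribution over $N$ values attains $\log_2 N$ bits, and the uniform case sets the benchmark that coding schemes are measured against; any skewed distribution over the same values falls short of it:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array (zero entries ignored)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

N = 8
uniform = np.full(N, 1.0 / N)
print(entropy(uniform), np.log2(N))  # both 3.0 bits: the maximum for N = 8

# A made-up non-uniform distribution over the same 8 values has lower entropy:
skewed = np.array([0.5, 0.2, 0.1, 0.05, 0.05, 0.04, 0.03, 0.03])
print(entropy(skewed))               # ~2.22 bits, strictly below log2(N)
```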
