Coding & Information Theory

4. (18%) (Entropy and Shannon-Fano Code) Given a source S with probabilities p_i = 0.3, 0.2, 0.15, 0.15, 0.1, 0.1.
(1) Calculate the entropy H(S) of S.
(2) Find the word-lengths, average word-length, and efficiency of a binary Shannon-Fano code for S.
(3) Let S have q equiprobable symbols. Find the average word-length L_n of an r-ary Shannon-Fano code for S^n, and verify that L_n / n → H_r(S) as n → ∞.

5. (18%) (Binary Symmetric Channel) Let Γ be the BSC. Its input alphabet is A = Z_2 = {0, 1} and its output alphabet is B = Z_2 = {0, 1}. Its input probabilities have the form p_0 = Pr(a = 0) = p and p_1 = Pr(a = 1) = p̄ = 1 − p for some p such that 0 ≤ p ≤ 1. Its channel matrix has the form

    M = ( P_00  P_01 )
        ( P_10  P_11 )

for some P where 0 ≤ P ≤ 1, where P_ij = Pr(b = j | a = i) is the forward probability (so for the BSC, P_01 = P_10 = P and P_00 = P_11 = 1 − P).
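As a sanity check for parts (1) and (2) of problem 4, here is a short Python sketch (an illustration, not part of the original problem) using the standard Shannon-Fano word-length rule l_i = ⌈log_2(1/p_i)⌉ and efficiency H(S) / L:

```python
import math

# Source probabilities from problem 4 (binary code, r = 2)
p = [0.3, 0.2, 0.15, 0.15, 0.1, 0.1]

# (1) Entropy H(S) = -sum_i p_i log2(p_i)
H = -sum(pi * math.log2(pi) for pi in p)

# (2) Shannon-Fano word lengths l_i = ceil(log2(1 / p_i))
lengths = [math.ceil(-math.log2(pi)) for pi in p]

# Average word-length L = sum_i p_i * l_i and efficiency H / L
L = sum(pi * li for pi, li in zip(p, lengths))
efficiency = H / L

print(f"H(S)       = {H:.4f} bits")      # ≈ 2.4710
print(f"lengths    = {lengths}")         # [2, 3, 3, 3, 4, 4]
print(f"avg length = {L:.2f}")           # 2.90
print(f"efficiency = {efficiency:.4f}")  # ≈ 0.8521
```

The computed lengths satisfy the Kraft inequality automatically, so a prefix code with these lengths exists.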
