Question: [ECE 645 only] A Huffman code finds the optimal codeword to assign to a given block of N source symbols.

(a) Show that {01, 100, 101, 1110, 1111, 0011, 0001} cannot be a Huffman code for any N, for any source distribution where every string to be coded has non-zero probability.

(b) For a source producing an IID sequence of discrete random variables, each drawn from source alphabet X, it has been found that a Huffman code on blocks of length 2 (i.e., N = 2 source symbols are taken at a time) has rate 2 bits/symbol, and a Huffman code on blocks of length 3 (i.e., N = 3 source symbols are taken at a time) has rate 1.6 bits/symbol. Find upper and lower bounds on the first-order entropy of the source. What can be deduced about the size of the source alphabet?

(c) Does there exist a source producing an IID sequence of discrete random variables, and integers N ∈ {1, 2, 3, 4, ...} and M ∈ {2, 3, 4, ...}, such that a Huffman code on blocks of length N has (strictly) smaller rate (in bits/symbol) than a Huffman code on blocks of length MN?
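As a quick sanity check for part (a) (not a full solution), the sketch below verifies two things about the listed codeword set: that it is prefix-free, and that its Kraft sum is strictly less than 1. A Huffman code for symbols that all have non-zero probability is a complete prefix code, so its codeword lengths must satisfy the Kraft inequality with equality; the codeword list is taken from the problem statement, and everything else here is illustrative.

```python
from itertools import combinations

# Codeword set from part (a) of the problem.
codewords = ["01", "100", "101", "1110", "1111", "0011", "0001"]

# Prefix-freeness: no codeword may be a prefix of another.
prefix_free = all(
    not a.startswith(b) and not b.startswith(a)
    for a, b in combinations(codewords, 2)
)

# Kraft sum: sum of 2^(-length) over all codewords.
# A Huffman code on blocks with non-zero probability is complete,
# so this sum must equal exactly 1; a sum below 1 means some codeword
# can be shortened without violating the prefix property.
kraft_sum = sum(2 ** -len(c) for c in codewords)

print(f"prefix-free: {prefix_free}")        # True
print(f"Kraft sum:   {kraft_sum}")          # 0.75
print(f"complete:    {kraft_sum == 1.0}")   # False
```

Because the Kraft sum comes out to 0.75 rather than 1, at least one codeword could be shortened while keeping the code prefix-free, which contradicts optimality; turning that observation into the argument the problem asks for is left to the solution steps.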
