Question: Simple optimum compression of a Markov source.

Consider the four-state Markov process U1, U2, ..., having transition matrix [transition matrix not shown in this extract]; thus, the probability that S4 follows S2 is equal to zero. Design four codes C1, C2, C3, C4 (one for each state 1, 2, 3, and 4), each code mapping elements of the set of Si's into sequences of 0's and 1's, such that this Markov process can be sent with maximal compression by the following scheme:

(a) Note the present symbol Xn = i.
(b) Select code Ci.
(c) Note the next symbol Xn+1 = j and send the codeword in Ci corresponding to j.
(d) Repeat for the next symbol.

What is the average message length of the next symbol conditioned on the previous state Xn = i using this coding scheme? What is the unconditional average number of bits per source symbol? Relate this to the entropy rate H(U) of the Markov chain.
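For the last part of the question, the relation to the entropy rate can be stated in general terms; this is the standard bound for an optimal (Huffman) code applied to each conditional distribution p(j | i), with mu denoting the stationary distribution of the chain:

\[
H(X_{n+1}\mid X_n=i)\;\le\;\bar L_i\;<\;H(X_{n+1}\mid X_n=i)+1,
\qquad
\bar L=\sum_i \mu_i \bar L_i,
\qquad
H(U)=\sum_i \mu_i\,H(X_{n+1}\mid X_n=i)\;\le\;\bar L\;<\;H(U)+1.
\]

The lower bounds hold with equality whenever every row of the transition matrix is dyadic (each nonzero entry a negative power of 2), since the Huffman codeword lengths are then exactly -log2 p(j | i).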
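As a rough illustration of the whole scheme, the sketch below builds one Huffman code per state (one per row of the transition matrix), then computes the conditional average lengths, the stationary distribution, the unconditional average bits per symbol, and the entropy rate. The matrix P used here is a hypothetical stand-in chosen only to satisfy the stated constraint that S4 never follows S2 (the actual matrix from the problem is not reproduced in this extract); its rows happen to be dyadic, so the average length comes out exactly equal to H(U).

```python
# Per-state Huffman coding of a Markov source: a minimal sketch.
# NOTE: P below is a hypothetical placeholder transition matrix (the matrix from
# the original problem is not available here); it only respects P(S4 | S2) = 0.
import heapq
from itertools import count
from math import log2

P = [
    [0.500, 0.250, 0.125, 0.125],
    [0.500, 0.250, 0.250, 0.000],   # P(S4 | S2) = 0, so C2 needs no codeword for S4
    [0.250, 0.250, 0.250, 0.250],
    [0.125, 0.125, 0.250, 0.500],
]

def huffman_lengths(probs):
    """Binary Huffman codeword lengths; zero-probability symbols get no codeword."""
    tick = count()                               # unique tie-breaker so the heap never compares lists
    heap = [(p, next(tick), [j]) for j, p in enumerate(probs) if p > 0]
    lengths = {j: 0 for _, _, syms in heap for j in syms}
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)          # merge the two least likely subtrees
        p2, _, s2 = heapq.heappop(heap)
        for j in s1 + s2:                        # every symbol in the merge moves one level deeper
            lengths[j] += 1
        heapq.heappush(heap, (p1 + p2, next(tick), s1 + s2))
    return lengths

# Steps (a)-(c): code C_i is a Huffman code for row i of P; report its expected length.
cond_len = []
for i, row in enumerate(P):
    lens = huffman_lengths(row)
    Li = sum(row[j] * lens[j] for j in lens)          # E[length | X_n = S_{i+1}]
    Hi = -sum(p * log2(p) for p in row if p > 0)      # H(X_{n+1} | X_n = S_{i+1})
    cond_len.append(Li)
    print(f"S{i+1}: conditional avg length = {Li:.3f} bits, conditional entropy = {Hi:.3f} bits")

# Stationary distribution mu (mu = mu P) by power iteration (the chain is ergodic).
n = len(P)
mu = [1.0 / n] * n
for _ in range(500):
    mu = [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]

avg_bits = sum(mu[i] * cond_len[i] for i in range(n))
H_rate = -sum(mu[i] * P[i][j] * log2(P[i][j])
              for i in range(n) for j in range(n) if P[i][j] > 0)
print(f"unconditional average = {avg_bits:.3f} bits/symbol, entropy rate H(U) = {H_rate:.3f} bits/symbol")
```

With a non-dyadic transition matrix substituted for P, the same computation would show the unconditional average length landing strictly between H(U) and H(U) + 1, matching the bounds stated above.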
