Huffman coding reduces the average number of bits/message by using shorter codewords for
more frequently-occurring messages, and longer codewords for less frequently-occurring messages.
In this assignment you will apply probability and information theory to data compression.
Assume a source sends five messages, selected independently of each other, with the following
probabilities:
P(M1)=0.5
P(M2)=0.2
P(M3)=0.1
P(M4)=0.1
P(M5)=0.1
Answer these questions and show all of your work:
What is the information content of message M1, in bits?
What is the information content of message M2, in bits?
What is the information content of message M3, in bits?
What is the information content of message M4, in bits?
What is the information content of message M5, in bits?
What is the average number of information bits per message (entropy of the source)?
How many codebits per message are required when encoding each message using the shortest-
possible fixed-length binary code?
What is the average number of codebits per message resulting from using a Huffman code
to assign one variable-length codeword per message?
What is the average number of codebits per message resulting from using a Huffman code
to assign one variable-length codeword per message pair?
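
Below is a minimal Python sketch of how each of these quantities could be computed: information content via I(M) = -log2 P(M), source entropy, the shortest fixed-length code size, and the average Huffman codeword length for single messages and for message pairs. The helper huffman_lengths and all variable names are illustrative choices for this sketch, not part of the assignment.

    import heapq
    import math
    from itertools import product

    # Source message probabilities from the problem statement.
    probs = {"M1": 0.5, "M2": 0.2, "M3": 0.1, "M4": 0.1, "M5": 0.1}

    # Information content of each message: I(M) = -log2 P(M).
    for m, p in probs.items():
        print(f"I({m}) = {-math.log2(p):.4f} bits")

    # Entropy of the source: H = sum over messages of -P(M) * log2 P(M).
    entropy = sum(-p * math.log2(p) for p in probs.values())
    print(f"H = {entropy:.4f} bits/message")

    # Shortest fixed-length binary code: ceil(log2(number of messages)).
    print(f"Fixed-length code: {math.ceil(math.log2(len(probs)))} codebits/message")

    def huffman_lengths(weights):
        """Return Huffman codeword lengths for a dict of symbol -> probability."""
        # Each heap entry: (weight, tiebreak counter, [symbols in this subtree]).
        heap = [(w, i, [s]) for i, (s, w) in enumerate(weights.items())]
        heapq.heapify(heap)
        lengths = {s: 0 for s in weights}
        counter = len(heap)
        while len(heap) > 1:
            w1, _, s1 = heapq.heappop(heap)
            w2, _, s2 = heapq.heappop(heap)
            # Every symbol under a merged node gains one bit of depth.
            for s in s1 + s2:
                lengths[s] += 1
            heapq.heappush(heap, (w1 + w2, counter, s1 + s2))
            counter += 1
        return lengths

    # Average codeword length, one codeword per single message.
    single_lengths = huffman_lengths(probs)
    avg_single = sum(probs[s] * single_lengths[s] for s in probs)
    print(f"Huffman, one codeword per message: {avg_single:.4f} codebits/message")

    # Average codeword length per message when encoding message pairs.
    # Joint probabilities are products because messages are independent;
    # divide the average pair length by 2 to get codebits per message.
    pairs = {a + b: probs[a] * probs[b] for a, b in product(probs, repeat=2)}
    pair_lengths = huffman_lengths(pairs)
    avg_pair = sum(pairs[s] * pair_lengths[s] for s in pairs) / 2
    print(f"Huffman, one codeword per message pair: {avg_pair:.4f} codebits/message")

The sketch tracks Huffman codeword lengths by incrementing the depth of every symbol under each merged node instead of building an explicit code tree; either approach yields the same average length, which is all these questions require. The pair code uses joint probabilities equal to products of the individual probabilities because the problem states the messages are selected independently.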