Question:

The Shannon entropy measures the information content of an input string and plays a cornerstone role in information theory and data compression. Given a string of \(n\) characters, let \(f_{c}\) be the number of occurrences of character \(c\). The quantity \(p_{c}=f_{c}/n\) is an estimate of the probability that \(c\) would appear at a given position if the string were random, and the entropy is defined as \(H=\sum_{c} -p_{c}\log_{2}p_{c}\), where the sum ranges over all characters that appear in the string. The entropy is said to measure the information content of a string: among strings of a given length over a fixed set of characters, the entropy is at its maximum when every character appears the same number of times. Write a program that takes the name of a file as a command-line argument and prints the entropy of the text in that file. Run your program on a web page that you read regularly, a recent paper that you wrote, and the fruit fly genome found on the website.
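
To make the definition concrete, here is a small worked instance (the string is an arbitrary choice): for the string "aab", \(n=3\), \(p_{\mathrm{a}}=2/3\), and \(p_{\mathrm{b}}=1/3\), so \(H=-\tfrac{2}{3}\log_{2}\tfrac{2}{3}-\tfrac{1}{3}\log_{2}\tfrac{1}{3}\approx 0.918\) bits per character.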


Step by Step Answer:
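
One way to solve the exercise is the minimal Java sketch below. It assumes the whole file fits in memory and counts plain Java char values; the class name Entropy and the printed format are illustrative choices, not given by the exercise.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;

public class Entropy {
    public static void main(String[] args) throws IOException {
        // Read the entire file named on the command line into one string.
        String text = new String(Files.readAllBytes(Paths.get(args[0])));
        int n = text.length();

        // Count f_c, the number of occurrences of each character c.
        Map<Character, Integer> freq = new HashMap<>();
        for (int i = 0; i < n; i++)
            freq.merge(text.charAt(i), 1, Integer::sum);

        // Sum -p_c * log2(p_c) over the characters that appear,
        // with p_c = f_c / n; absent characters contribute nothing.
        double entropy = 0.0;
        for (int f : freq.values()) {
            double p = (double) f / n;
            entropy -= p * Math.log(p) / Math.log(2);
        }

        System.out.printf("%.4f bits per character%n", entropy);
    }
}

Compile with javac Entropy.java and run it as, for example, java Entropy index.html (the file name is hypothetical). English text typically scores around 4 to 5 bits per character with this single-character model, while a four-letter genome sequence cannot exceed \(\log_{2}4=2\) bits per character.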
