
Question:

What is the information entropy of the results of flipping a biased coin 1000 times, if the coin comes up tails with probability 5/6 and heads with probability 1/6? For this weighted coin, can you find an encoding for sequential pairs of coin flips (e.g. HH, HT, etc.) in terms of sequences of bits (possibly different numbers of bits for different possible pairs) that takes fewer than 2 bits on average for each pair of flips? How many bits on average does your encoding take, and how close is it to the information-theoretic limit?


Step by Step Answer:
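The calculation can be sketched in Python: compute the per-flip Shannon entropy, then evaluate a Huffman-style prefix code over pairs of flips. The specific codeword assignment below (TT→0, TH→10, HT→110, HH→111) is one assumed optimal choice, not necessarily the textbook's; any prefix code with the same codeword lengths performs identically.

```python
from math import log2

# Per-flip probabilities for the biased coin (from the problem statement)
p_tails, p_heads = 5/6, 1/6

# Shannon entropy per flip: H = -sum_i p_i log2(p_i)
H_flip = -(p_tails * log2(p_tails) + p_heads * log2(p_heads))
print(f"Entropy per flip: {H_flip:.4f} bits")              # ~0.6500 bits
print(f"Entropy of 1000 flips: {1000 * H_flip:.1f} bits")  # ~650.0 bits

# Pair probabilities (successive flips are independent)
pairs = {"TT": p_tails**2, "TH": p_tails * p_heads,
         "HT": p_heads * p_tails, "HH": p_heads**2}

# One Huffman-style prefix code for the four pairs (assumed assignment):
# the most probable pair gets the shortest codeword.
code = {"TT": "0", "TH": "10", "HT": "110", "HH": "111"}

avg_len = sum(pairs[k] * len(code[k]) for k in pairs)
print(f"Average bits per pair: {avg_len:.4f}")             # 53/36 ~ 1.4722
print(f"Entropy limit per pair: {2 * H_flip:.4f} bits")    # ~1.3000 bits
print(f"Overhead: {100 * (avg_len / (2 * H_flip) - 1):.1f}% above the limit")
```

With these probabilities the average is 53/36 ≈ 1.472 bits per pair, below the 2 bits of a naive fixed-length encoding but about 13% above the information-theoretic limit of 2 × 0.650 ≈ 1.300 bits per pair; encoding longer blocks of flips would close the remaining gap.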

Related Book:

The Physics of Energy

ISBN: 978-1107016651

1st edition

Authors: Robert L. Jaffe, Washington Taylor
