Homework 1

Due 15 September, 2016

Give reasons for all answers.

  1. Suppose a source outputs a stream of bits with the property that a zero is produced with probability 0.9 if the previous bit was a zero and with probability 0.2 if the previous bit was a one.
    1. In a long stream of bits from this source, what proportion are zeroes and what proportion are ones? (Note: Example 2.3.1 in the text is mostly correct, but the third equation should read H = (30/31) log (31/30) + (1/31) log 31 = 0.206 bits, approximately. Since the formula measures information in bits, the logarithms are to the base 2. To calculate a base-2 logarithm you can use the change-of-base formula log2 x = (ln x) / (ln 2).)
    2. What is the entropy of this source?
    3. What is the entropy of a source that produces independent bits with the same proportion of zeroes and ones?
    4. Design optimal codes for blocks of two bits from this source and from the independent bit source.
    5. What is the performance (average number of output bits per input bit) for each of these codes applied to both of these sources?
    6. How do these rates compare to the entropies?
  2. As discussed in Section 3.2.1, there may be more than one optimal code for some distributions. Suppose a source produces six letters independently in the ratio 9:9:3:3:2:1.
    1. What is the entropy of this source?
    2. Find as many substantially different optimal codes for this source as you can. Two codes are substantially different if their multisets of codeword lengths differ.
    3. Verify that the average lengths of these codes are the same.
    4. Find another possible set of codeword lengths by calculating (-log2 P(a)) for each letter and rounding up to an integer. How does the average length of this code compare to the average length of the optimal code and to the entropy?
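
The corrected equation quoted in problem 1.1 can be checked numerically. Here is a minimal sketch using only the Python standard library; the helper name log2 is illustrative, and the script simply evaluates the corrected third equation of Example 2.3.1 via the change-of-base formula:

```python
import math

def log2(x):
    # Change-of-base formula: log2 x = (ln x) / (ln 2).
    return math.log(x) / math.log(2)

# Corrected third equation of Example 2.3.1:
#   H = (30/31) log (31/30) + (1/31) log 31
H = (30 / 31) * log2(31 / 30) + (1 / 31) * log2(31)
print(round(H, 3))  # approximately 0.206 bits
```

The same change-of-base idea applies anywhere a base-2 logarithm is needed, e.g. in problem 2.4 when computing -log2 P(a) for each letter.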