Homework 1
Due 15 September 2016
Give reasons for all answers.
- Suppose a source outputs a stream of bits with the property that a zero
is produced with probability 0.9 if the previous bit was a zero and with probability
0.2 if the previous bit was a one.
- In a long stream of bits from this source, what are the proportions of
zeroes and ones? Note that Example 2.3.1 in the text is mostly correct,
except that the third equation should read: H = (30/31) log (31/30) +
(1/31) log 31 = 0.206 bits (approximately). Since the formula calculates
bits, the logarithms are to base 2. To calculate a base-2 logarithm
you can use the formula log2 x = (ln x) / (ln 2).
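As a quick check of the corrected equation, the value can be computed directly; this is a minimal sketch (the function name `entropy` is just an illustrative choice) that evaluates the standard entropy formula for the distribution (30/31, 1/31):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = sum of p * log2(1/p) over the distribution."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Corrected third equation of Example 2.3.1:
# H = (30/31) log2(31/30) + (1/31) log2(31)
h = entropy([30 / 31, 1 / 31])
print(round(h, 3))  # approximately 0.206 bits
```

The same helper applies to each source in this assignment once its letter probabilities are known; note that `math.log2(x)` is equivalent to the formula ln x / ln 2 given above.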
- What is the entropy of this source?
- What is the entropy of a source that produces independent bits with
the same proportion of zeroes and ones?
- Design optimal codes for blocks of two bits from this source and from
the independent bit source.
- What is the performance (average number of output bits per input bit)
for each of these codes applied to both of these sources?
- How do these rates compare to the entropies?
- As discussed in Section 3.2.1, there may be more than one optimal code for
some distributions. Suppose a source produces six letters independently in
the ratio 9:9:3:3:2:1.
- What is the entropy of this source?
- Find as many substantially different optimal codes for this source as
you can. Two codes are substantially different if their multisets of
codeword lengths differ.
- Verify that the average lengths of these codes are the same.
- Find another possible set of codeword lengths by calculating -log2 P(a)
for each letter and rounding up to the next integer. How does the average
length of this code compare to the average length of the optimal code
and to the entropy?