Homework 2

Due 22 September 2016

Give reasons for all answers.

  1. Suppose you want to encode a source that produces characters with the following distribution:
    P(a1) = 0.3, P(a2) = 0.2, P(a3) = 0.2, P(a4) = 0.1, P(a5) = 0.08, P(a6) = 0.06, P(a7) = 0.04, P(a8) = 0.02
    Design binary, ternary, and quaternary Huffman codes for this source and compare the average lengths to the entropy. Remember to use the appropriate base for the logarithms when calculating the entropies; a small sketch for checking the entropy values appears after this problem list.
  2. Design a Tunstall code for a source that produces independent characters with probabilities P(a1) = 0.5, P(a2) = 0.3, P(a3) = 0.2, where the codewords are blocks of two ternary digits.
  3. In class we discussed how to encode an arbitrary positive integer n using the ordinary binary representation of n preceded by a unary code giving the number of bits needed to represent n. For example, 1 is encoded as 01, 2 is encoded as 1010, 5 is encoded as 110101, etc. (a short sketch of this encoding appears after this problem list). Find a probability distribution on the positive integers for which this is an optimal code. Try to find a simple expression for the probabilities.
  4. Use the Kraft-McMillan inequality (restated after this list for reference) to show that there is no encoding of the positive integers that uses fewer than log n bits to represent n for every n.
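
For problem 1, here is a minimal sketch (not part of the assignment) for checking the entropy values in the three bases; the list probs and the helper entropy are illustrative names only, not a required method:

import math

# P(a1) ... P(a8) from problem 1
probs = [0.3, 0.2, 0.2, 0.1, 0.08, 0.06, 0.04, 0.02]

def entropy(p, base):
    # Entropy of the distribution p, measured in base-`base` units
    # (bits for base 2, ternary digits for base 3, quaternary digits for base 4).
    return -sum(x * math.log(x, base) for x in p if x > 0)

for base in (2, 3, 4):
    print("base", base, "entropy =", round(entropy(probs, base), 4))

The average length of each Huffman code, measured in digits of the matching alphabet, should come out at least as large as the entropy computed in that base.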
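
For problem 3, here is a minimal sketch of the integer code as described above, assuming the unary prefix for a k-bit number is written as k-1 ones followed by a zero, which matches the examples 01, 1010, and 110101; the function name encode is just for illustration:

def encode(n):
    # Ordinary binary representation of n, preceded by a unary code
    # for the number of bits in that representation.
    binary = bin(n)[2:]            # e.g. 5 -> "101"
    k = len(binary)                # number of bits needed to represent n
    unary = "1" * (k - 1) + "0"    # unary code for k, e.g. 3 -> "110"
    return unary + binary

for n in (1, 2, 5):
    print(n, "->", encode(n))      # prints 01, 1010, 110101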
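
For problem 4, the Kraft-McMillan inequality named in the problem says that any uniquely decodable code over a D-ary alphabet with codeword lengths l_1, l_2, ... must satisfy (written in LaTeX; the notation is mine, not prescribed by the problem):

\sum_{i} D^{-l_i} \le 1

In problem 4 the alphabet is binary, so D = 2, and the lengths are the numbers of bits assigned to the positive integers.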