Compressibility Example
From my algorithms textbook:

The annual county horse race is bringing in three thoroughbreds who have never competed against one another. Excited, you study their past 200 races and summarize these as probability distributions over four outcomes: first (“first place”), second, third, and other.

                       Outcome     Aurora   Whirlwind   Phantasm
                       first        0.15      0.30        0.20
                       second       0.10      0.05        0.30
                       third        0.70      0.25        0.30
                       other        0.05      0.40        0.20

Which horse is the most predictable? One quantitative approach to this question is to look at compressibility. Write down the history of each horse as a string of 200 values (first, second, third, other). The total number of bits needed to encode these track-record strings can then be computed using Huffman’s algorithm. This works out to 290 bits for Aurora, 380 for Whirlwind, and 420 for Phantasm (check it!). Aurora has the shortest encoding and is therefore in a strong sense the most predictable.

How did they get 420 for Phantasm? I keep getting 400 bytes, like so:

Combine first and other: 0.2 + 0.2 = 0.4; combine second and third: 0.3 + 0.3 = 0.6. Every outcome then gets a 2-bit code, so 200 outcomes take 2 × 200 = 400.
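That arithmetic can be spelled out as a quick check (my own sketch of the calculation above, not code from the book):

```python
# Phantasm's probabilities: first, second, third, other.
probs = [0.20, 0.30, 0.30, 0.20]

# Merging first+other (0.4) and second+third (0.6) puts every
# outcome at depth 2 in the Huffman tree, i.e. a 2-bit codeword,
# so 200 outcomes need:
total_bits = 200 * sum(p * 2 for p in probs)
print(round(total_bits))  # 400
```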

Is there something I've misunderstood about the Huffman encoding algorithm?

Textbook available here: http://www.cs.berkeley.edu/~vazirani/algorithms.html (page 156).

Wellmannered answered 10/6, 2010 at 13:39 Comment(1)
"Which horse is most predictable?" - this doesn't actually answer that, because a horse's placing depends on the other horses in the race. Aurora might run the course in exactly the same time every time - down to the millisecond! - and still get the results shown here purely because of the competition. – Swainson
I think you're right: Phantasm's 200 outcomes can be represented using 400 bits (not bytes). 290 for Aurora and 380 for Whirlwind are correct.

The correct Huffman code is generated in the following manner:

  1. Combine the two least probable outcomes: 0.2 and 0.2. Get 0.4.
  2. Combine the next two least probable outcomes: 0.3 and 0.3. Get 0.6.
  3. Combine 0.4 and 0.6. Get 1.0.
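Those merge steps can be checked mechanically. Here is a small sketch of my own using Python's heapq; it relies on the fact that the expected code length of a Huffman code equals the sum of the weights of the merged (internal) nodes:

```python
import heapq

def huffman_total_bits(probs, n_races=200):
    """Total bits to encode n_races outcomes under an optimal Huffman code.

    Repeatedly merges the two least probable subtrees. Each merge adds
    one bit to every symbol inside the merged subtrees, so the running
    total of merged weights equals the expected bits per symbol.
    """
    heap = list(probs)
    heapq.heapify(heap)
    bits_per_symbol = 0.0
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        bits_per_symbol += a + b
        heapq.heappush(heap, a + b)
    return round(bits_per_symbol * n_races)

# Probabilities in the order first, second, third, other.
horses = {
    "Aurora":    [0.15, 0.10, 0.70, 0.05],
    "Whirlwind": [0.30, 0.05, 0.25, 0.40],
    "Phantasm":  [0.20, 0.30, 0.30, 0.20],
}
for name, p in horses.items():
    print(name, huffman_total_bits(p))
# Aurora 290
# Whirlwind 380
# Phantasm 400
```

This reproduces the book's 290 and 380, and gives 400 (not 420) for Phantasm.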

You would get 420 bits if you did this instead:

  1. Combine the two least probable outcomes: 0.2 and 0.2. Get 0.4.
  2. Combine 0.4 and 0.3. (Wrong!) Get 0.7.
  3. Combine 0.7 and 0.3. Get 1.0.
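For comparison, here are the leaf depths each merge order produces and the totals they imply (a sketch; the depths are read off the two trees described above):

```python
# Phantasm's distribution, keyed by outcome.
probs = {"first": 0.20, "second": 0.30, "third": 0.30, "other": 0.20}

# Correct order: (first+other), (second+third), then merge both.
# Every leaf ends up at depth 2.
correct = {"first": 2, "second": 2, "third": 2, "other": 2}

# Faulty order: (first+other)=0.4, then 0.4+0.3=0.7, then 0.7+0.3=1.0.
# Resulting depths: third 1, second 2, first 3, other 3.
faulty = {"first": 3, "second": 2, "third": 1, "other": 3}

def total_bits(depths, n_races=200):
    # Expected code length per race, times the number of races.
    return round(n_races * sum(probs[k] * depths[k] for k in probs))

print(total_bits(correct))  # 400
print(total_bits(faulty))   # 420
```

The faulty order trades a shorter code for third (probability 0.3) against longer codes for first and other (0.2 each), which costs 0.1 extra bits per race on average, hence the 20-bit gap.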
Onaonager answered 10/6, 2010 at 16:18 Comment(0)