Worked Examples: Huffman Coding

These examples step through the process of applying Huffman coding concepts, from code construction to decoding and complexity analysis.

Example 1: Prefix Code Property
❓ Problem: Why can't codes A=0, B=1, C=01 be used together?
💡 Solution: C=01 has A=0 as a prefix. String '01' is ambiguous: is it 'AB' or 'C'? Prefix codes ensure no code is a prefix of another.
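A quick way to test this property programmatically (the helper name `is_prefix_free` is my own) is to check every pair of codewords:

```python
def is_prefix_free(codes):
    """Return True if no codeword is a prefix of another codeword."""
    words = list(codes.values())
    for i, a in enumerate(words):
        for b in words[i + 1:]:
            if a.startswith(b) or b.startswith(a):
                return False
    return True

print(is_prefix_free({"A": "0", "B": "1", "C": "01"}))   # False: A=0 is a prefix of C=01
print(is_prefix_free({"A": "0", "B": "10", "C": "11"}))  # True: valid prefix code
```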
Example 2: Build Huffman Tree
❓ Problem: Build a Huffman tree for: A:8, B:3, C:1, D:1, E:2
💡 Solution: Merge order: (C:1, D:1)→2, (2, E:2)→4, (B:3, 4)→7, (7, A:8)→15. Codes: A=1, B=01, E=001, C=0000, D=0001.
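A minimal Python sketch of this construction using a min-heap (`heapq`). The 0/1 labels it assigns may differ from the codes above, since either child can take either bit, but the code lengths (1, 2, 3, 4, 4) match:

```python
import heapq
from itertools import count

def huffman_codes(freqs):
    """Build a Huffman tree with a min-heap and return {symbol: code}."""
    tie = count()  # tie-breaker so the heap never compares tree nodes directly
    heap = [(f, next(tie), sym) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # extract the two smallest weights
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tie), (left, right)))
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: recurse into children
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                                # leaf: a symbol
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

print(huffman_codes({"A": 8, "B": 3, "C": 1, "D": 1, "E": 2}))
# Code lengths: A=1, B=2, E=3, C=4, D=4 bits, matching the solution above.
```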
Example 3: Calculate Expected Length
❓ Problem: Given: A:50%, B:25%, C:15%, D:10%. What is the expected code length?
💡 Solution: A:1 bit, B:2 bits, C:3 bits, D:3 bits. Expected = $0.5(1) + 0.25(2) + 0.15(3) + 0.1(3) = 1.75$ bits per symbol.
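The same arithmetic, written out as a short check using the probabilities and code lengths from the solution:

```python
# Expected code length = sum over symbols of P(symbol) * len(code)
probs = {"A": 0.50, "B": 0.25, "C": 0.15, "D": 0.10}
lengths = {"A": 1, "B": 2, "C": 3, "D": 3}   # Huffman code lengths from the solution
expected = sum(probs[s] * lengths[s] for s in probs)
print(expected)  # 1.75 bits per symbol
```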
Example 4: Decode Huffman String
❓ Problem: Given tree with A=0, B=10, C=11. Decode: 01011100
💡 Solution: 0|10|11|10|0 → A B C B A
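A small decoding sketch (the function name `decode` is my own). Greedy left-to-right matching works precisely because the code is prefix-free, so the first complete codeword is always the right one:

```python
def decode(bits, codes):
    """Greedy prefix-code decoding: match codewords left to right."""
    lookup = {code: sym for sym, code in codes.items()}
    out, buffer = [], ""
    for bit in bits:
        buffer += bit
        if buffer in lookup:      # a complete codeword has been read
            out.append(lookup[buffer])
            buffer = ""
    if buffer:
        raise ValueError("leftover bits do not form a codeword")
    return "".join(out)

print(decode("01011100", {"A": "0", "B": "10", "C": "11"}))  # ABCBA
```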
Example 5: Fixed vs Variable Comparison
❓ Problem: 5 chars with freqs 100, 50, 30, 15, 5. Compare fixed-length vs Huffman.
💡 Solution: Fixed-length: $\lceil \log_2 5 \rceil = 3$ bits × 200 chars = 600 bits. Huffman code lengths are 1, 2, 3, 4, 4, so the total is $100(1) + 50(2) + 30(3) + 15(4) + 5(4) = 370$ bits. Savings ≈ 38%.
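One way to verify the 370-bit total without writing out the codes: the total encoded length equals the sum of the weights produced by each merge, because every merge adds one bit to every symbol beneath it. A sketch, with `huffman_total_bits` as a hypothetical helper name:

```python
import heapq

def huffman_total_bits(freqs):
    """Total encoded length = sum of the weights created by each merge."""
    heap = list(freqs)
    heapq.heapify(heap)
    total = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        total += a + b            # every symbol under this node gains one bit
        heapq.heappush(heap, a + b)
    return total

freqs = [100, 50, 30, 15, 5]
print(huffman_total_bits(freqs))  # 370 bits with Huffman coding
print(3 * sum(freqs))             # 600 bits with a fixed 3-bit code
```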
Example 6: Time Complexity
❓ Problem: What is the time complexity of building a Huffman tree for n symbols?
💡 Solution: We perform $n-1$ merges. Each merge: 2 Extract-Min + 1 Insert = $O(\log n)$ using a min-heap. Total: $O(n \log n)$.
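To see the operation count concretely, a small sketch (`count_heap_ops` is a made-up name) that tallies heap operations over the $n-1$ merges; each operation costs $O(\log n)$, giving $O(n \log n)$ overall:

```python
import heapq

def count_heap_ops(n):
    """Count Extract-Min and Insert operations while merging n dummy weights."""
    heap = [1] * n
    heapq.heapify(heap)
    extracts = inserts = 0
    while len(heap) > 1:
        heapq.heappop(heap)       # Extract-Min
        heapq.heappop(heap)       # Extract-Min
        extracts += 2
        heapq.heappush(heap, 2)   # Insert the merged node
        inserts += 1
    return extracts, inserts

print(count_heap_ops(5))  # (8, 4): 2(n-1) extractions and n-1 insertions
```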