
Greedy Algorithms


  1. Greedy Algorithms: Faster Algorithms for Well-Behaved Optimization Problems
  2. Dynamic Programming Is Blind
     - The algorithms we have studied so far are relatively inefficient:
       - Matrix chain multiplication: O(n³)
       - Longest common subsequence: O(mn)
       - Optimal binary search trees: O(n³)
     - Why? We have many choices when computing an optimal solution, and we exhaustively (blindly) check all of them.
     - We would much rather have a way to decide which choice is best, or at least to restrict the choices we have to try.
     - Greedy algorithms work for problems where we can decide which choice is best.
  3. Making a Greedy Choice
     - Greedy choice: we make the choice that looks best at the moment. (Every time we make a choice, we greedily try to maximize our profit.)
     - An example where this does not work: finding a longest monotonically increasing subsequence.
       - Given the sequence ‹3 4 5 17 7 8 9›, a longest monotonically increasing subsequence is ‹3 4 5 7 8 9›.
       - The greedy choice after choosing ‹3 4 5› is to choose 17, which precludes the addition of any further element and thus yields the suboptimal sequence ‹3 4 5 17›.
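The failure mode above can be made concrete with a small Python sketch: a greedy scan that always keeps the next element that extends the increasing run.

```python
def greedy_increasing_subsequence(seq):
    """Greedy sketch: keep each element that extends the current
    increasing subsequence. This is NOT correct in general."""
    result = []
    for x in seq:
        if not result or x > result[-1]:
            result.append(x)
    return result

# The greedy choice takes 17 right after <3 4 5>, blocking 7, 8, 9:
print(greedy_increasing_subsequence([3, 4, 5, 17, 7, 8, 9]))  # [3, 4, 5, 17]
```

The greedy scan returns a length-4 subsequence, although ‹3 4 5 7 8 9› of length 6 exists, confirming that this problem lacks the greedy choice property.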
  4. A Scheduling Problem
     i    1  2  3  4  5  6  7   8   9   10  11
     s_i  1  3  0  5  3  5  6   8   8   2   12
     f_i  4  5  6  7  8  9  10  11  12  13  14
     Subsets of mutually compatible activities:
     - {a_3, a_9, a_11}
     - {a_1, a_4, a_8, a_11}
     - {a_2, a_4, a_9, a_11}
  5. A Scheduling Problem
     - A classroom can be used for one class at a time.
     - There are n classes that want to use the classroom.
     - Every class has a corresponding time interval I_j = [s_j, f_j) during which the room would be needed for this class.
     - Our goal is to choose a maximal number of classes that can be scheduled to use the classroom without two classes ever using it at the same time.
     - Assume that the classes are sorted by increasing finish times; that is, f_1 < f_2 < … < f_n.
  6. The Structure of an Optimal Schedule
     - Let S_{i,j} be the set of classes that begin after time f_i and end before time s_j; that is, these classes can be scheduled between classes C_i and C_j.
     - We can add two fictitious classes C_0 and C_{n+1} with f_0 = –∞ and s_{n+1} = +∞. Then S_{0,n+1} is the set of all classes.
     - Assume that class C_k is part of an optimal schedule of the classes in S_{i,j}.
     - Then i < k < j, and the optimal schedule consists of a maximal subset of S_{i,k}, {C_k}, and a maximal subset of S_{k,j}.
  7. The Structure of an Optimal Schedule
     Hence, if Q(i,j) is the size of an optimal schedule for set S_{i,j}, we have

       Q(i,j) = 0                                               if S_{i,j} = ∅
       Q(i,j) = max { Q(i,k) + Q(k,j) + 1 : C_k ∈ S_{i,j} }     otherwise.
  8. Making a Greedy Choice
     - Lemma: There exists an optimal schedule for the set S_{i,j} that contains the class C_k in S_{i,j} that finishes first, that is, the class C_k in S_{i,j} with minimal index k.
     - Lemma: If we choose class C_k as in the previous lemma, then the set S_{i,k} is empty.
  9. A Recursive Greedy Algorithm
     Recursive-Activity-Selector(s, f, i, j)
     1  m ← i + 1
     2  while m < j and s_m < f_i
     3    do m ← m + 1
     4  if m < j
     5    then return {a_m} ∪ Recursive-Activity-Selector(s, f, m, j)
     6    else return ∅
  10. A Recursive Greedy Algorithm
      Recursive-Schedule(S)
      1  if |S| = 0
      2    then return ∅
      3  let C_k be the class with minimal finish time in S
      4  remove C_k and all classes that begin before C_k ends from S; let S' be the resulting set
      5  O ← Recursive-Schedule(S')
      6  return O ∪ {C_k}
      Depending on the data structure we use to store S, this algorithm has running time O(n²) or O(n log n). In addition, the algorithm has a certain overhead for maintaining the recursion stack, because it is recursive.
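The recursive selector can be sketched in Python as follows. The arrays are 1-indexed with a fictitious class 0 (f_0 = –∞) standing in for the sentinel used on the slides, and classes are assumed sorted by increasing finish time.

```python
def recursive_selector(s, f, i, n):
    """A Python sketch of the recursive greedy activity selector.
    s and f are 1-indexed (index 0 is a fictitious class with
    f_0 = -infinity); classes are sorted by increasing finish time."""
    m = i + 1
    while m <= n and s[m] < f[i]:   # skip classes that start before C_i ends
        m += 1
    if m <= n:
        return [m] + recursive_selector(s, f, m, n)
    return []

# The activities from the scheduling slides:
s = [None, 1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]
f = [float('-inf'), 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14]
print(recursive_selector(s, f, 0, 11))  # [1, 4, 8, 11]
```

On the example data this selects classes {a_1, a_4, a_8, a_11}, one of the maximal compatible subsets listed earlier.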
  11. An Iterative Linear-Time Greedy Algorithm
      Iterative-Schedule(S)
      1  n ← |S|
      2  m ← –∞
      3  O ← ∅
      4  for i = 1..n
      5    do if s_i ≥ m
      6      then O ← O ∪ {C_i}
      7        m ← f_i
      8  return O
  21. An Iterative Linear-Time Greedy Algorithm
      The running time of this algorithm is obviously linear. Its correctness follows from the following lemma:
      Lemma: Let O be the current set of selected classes, and let C_k be the last class added to O. Then any class C_l, l > k, that conflicts with a class in O conflicts with C_k.
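Iterative-Schedule can be sketched in Python over a list of (start, finish) pairs, again assuming the input is sorted by increasing finish time:

```python
def iterative_schedule(classes):
    """Iterative linear-time greedy scheduler. `classes` is a list of
    (start, finish) intervals, assumed sorted by increasing finish time."""
    selected = []
    last_finish = float('-inf')      # m on the slides
    for start, finish in classes:
        if start >= last_finish:     # compatible with everything chosen so far
            selected.append((start, finish))
            last_finish = finish
    return selected

# The activities from the scheduling slides as (s_i, f_i) pairs:
classes = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 8), (5, 9),
           (6, 10), (8, 11), (8, 12), (2, 13), (12, 14)]
print(iterative_schedule(classes))  # [(1, 4), (5, 7), (8, 11), (12, 14)]
```

A single pass and O(1) work per class give the claimed linear running time (after the O(n log n) sort, if the input is not already sorted).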
  22. A Variation of the Problem
      - Instead of maximizing the number of classes we schedule, we want to maximize the total time the classroom is in use.
      - Our dynamic programming approach remains unchanged.
      - But none of the obvious greedy choices works:
        - Choose the class that starts earliest/latest
        - Choose the class that finishes earliest/latest
        - Choose the longest class
  23. When Do Greedy Algorithms Produce an Optimal Solution?
      The problem must have two properties:
      - Greedy choice property: an optimal solution can be obtained by making choices that seem best at the time, without considering their implications for solutions to subproblems.
      - Optimal substructure: an optimal solution can be obtained by augmenting the partial solution constructed so far with an optimal solution of the remaining subproblem.
  24. Text Encoding
      - Our next goal is to develop a code that represents a given text as compactly as possible.
      - A standard encoding is ASCII, which represents every character using 7 bits:
        "An English sentence"
        1000001 (A) 1101110 (n) 0100000 ( ) 1000101 (E) 1101110 (n) 1100111 (g) 1101100 (l) 1101001 (i) 1110011 (s) 1101000 (h) 0100000 ( ) 1110011 (s) 1100101 (e) 1101110 (n) 1110100 (t) 1100101 (e) 1101110 (n) 1100011 (c) 1100101 (e)
        = 133 bits ≈ 17 bytes
  25. - Of course, this is wasteful because we can encode the 12 distinct characters in 4 bits each:
        ‹space› = 0000  A = 0001  E = 0010  c = 0011  e = 0100  g = 0101  h = 0110  i = 0111  l = 1000  n = 1001  s = 1010  t = 1011
      - Then we encode the phrase as
        0001 (A) 1001 (n) 0000 ( ) 0010 (E) 1001 (n) 0101 (g) 1000 (l) 0111 (i) 1010 (s) 0110 (h) 0000 ( ) 1010 (s) 0100 (e) 1001 (n) 1011 (t) 0100 (e) 1001 (n) 0011 (c) 0100 (e)
      - This requires 76 bits ≈ 10 bytes.
  26. - An even better code is given by the following encoding:
        ‹space› = 000  A = 0010  E = 0011  s = 010  c = 0110  g = 0111  h = 1000  i = 1001  l = 1010  t = 1011  e = 110  n = 111
      - Then we encode the phrase as
        0010 (A) 111 (n) 000 ( ) 0011 (E) 111 (n) 0111 (g) 1010 (l) 1001 (i) 010 (s) 1000 (h) 000 ( ) 010 (s) 110 (e) 111 (n) 1011 (t) 110 (e) 111 (n) 0110 (c) 110 (e)
      - This requires 65 bits ≈ 9 bytes.
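As a sanity check, the three bit counts can be reproduced with a short Python sketch; the code table is copied from the slide above.

```python
# Variable-length prefix code from the slide:
code = {' ': '000', 'A': '0010', 'E': '0011', 's': '010', 'c': '0110',
        'g': '0111', 'h': '1000', 'i': '1001', 'l': '1010', 't': '1011',
        'e': '110', 'n': '111'}

text = "An English sentence"
encoded = ''.join(code[ch] for ch in text)

print(len(text) * 7)   # 7-bit ASCII: 133 bits
print(len(text) * 4)   # fixed 4-bit code: 76 bits
print(len(encoded))    # variable-length prefix code: 65 bits
```

Giving the frequent characters ‹space›, s, e, and n the short 3-bit codewords is what buys the improvement from 76 to 65 bits.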
  27. Codes That Can Be Decoded
      - Fixed-length codes:
        - Every character is encoded using the same number of bits.
        - To determine the boundaries between characters, we form groups of w bits, where w is the length of a codeword.
        - Examples: ASCII, our first improved code.
      - Prefix codes:
        - No codeword is a prefix of another codeword.
        - Examples: fixed-length codes, Huffman codes.
  28. Why Prefix Codes?
      - Consider a code that is not a prefix code: a = 01, m = 10, n = 111, o = 0, r = 11, s = 1, t = 0011
      - Now you send a fan letter to your favorite movie star. One of the sentences is "You are a star." You encode "star" as "1 0011 01 11".
      - Your idol receives the letter and decodes the text using your coding table: 100110111 = 10 0 11 0 111 = "moron". Oops, you have just insulted your idol.
      - Non-prefix codes are ambiguous.
  29. Why Are Prefix Codes Unambiguous?
      It suffices to show that the first character can be decoded unambiguously. We then remove this character and are left with the problem of decoding the first character of the remaining text, and so on until the whole text has been decoded.
      Assume that there are two characters c and c' that could potentially be the first character of the text, with encodings x_0 x_1 … x_k and y_0 y_1 … y_l. Assume w.l.o.g. that k ≤ l.
      Since both c and c' can occur at the beginning of the text, we have x_i = y_i for 0 ≤ i ≤ k; that is, x_0 x_1 … x_k is a prefix of y_0 y_1 … y_l, a contradiction to the prefix property.
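The argument above translates directly into a decoder sketch: scan the bit stream and emit a character as soon as the buffered bits match a codeword, which by the prefix property can happen for at most one codeword.

```python
def decode(bits, code):
    """Decode a bit string encoded with a prefix code. Because no
    codeword is a prefix of another, the first codeword that matches
    the buffered bits is the only possible one."""
    inverse = {v: k for k, v in code.items()}
    out, buf = [], ''
    for b in bits:
        buf += b
        if buf in inverse:          # unique match by the prefix property
            out.append(inverse[buf])
            buf = ''
    return ''.join(out)

code = {' ': '000', 'A': '0010', 'E': '0011', 's': '010', 'c': '0110',
        'g': '0111', 'h': '1000', 'i': '1001', 'l': '1010', 't': '1011',
        'e': '110', 'n': '111'}
encoded = ''.join(code[ch] for ch in "English")
print(decode(encoded, code))  # English
```

Running the same greedy scan with the non-prefix code of the previous slide is exactly what produced "moron": there, several codewords could match the buffer, so the first match need not be the intended one.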
  30. Representing a Prefix-Code Dictionary
      - Our example:
        ‹space› = 000  A = 0010  E = 0011  s = 010  c = 0110  g = 0111  h = 1000  i = 1001  l = 1010  t = 1011  e = 110  n = 111
      (Figure: a binary tree in which every left edge is labeled 0, every right edge 1, and the leaves hold the characters ‹spc›, A, E, s, c, g, h, i, l, t, e, n; a character's codeword is the label sequence on its root-to-leaf path.)
  32. The Cost of Prefix Codes
      - Let C be the set of characters in the text to be encoded, and let f(c) be the frequency of character c.
      - Let d_T(c) be the depth of node (character) c in the tree T representing the code. Then

          B(T) = Σ_{c ∈ C} f(c) · d_T(c)

        is the number of bits required to encode the text using the code represented by tree T. We call B(T) the cost of tree T.
      - Observation: In a tree T representing an optimal prefix code, every internal node has two children.
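A quick sketch of the cost computation, using the example frequencies f:5, e:9, c:12, b:13, d:16, a:45 from the Huffman slides that follow; the depths assumed here are the codeword lengths of the code that example produces.

```python
# B(T) = sum of f(c) * d_T(c): total encoded length in bits.
freq  = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
depth = {'a': 1, 'b': 3, 'c': 3, 'd': 3, 'e': 4, 'f': 4}  # assumed tree depths
B = sum(freq[ch] * depth[ch] for ch in freq)
print(B)  # 224
```

A fixed 3-bit code for the same frequencies would cost 3 · 100 = 300 bits, so the skewed tree is a substantial saving.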
  33. Huffman's Algorithm
      Huffman(C)
      1  n ← |C|
      2  Q ← C
      3  for i = 1..n – 1
      4    do allocate a new node z
      5      left[z] ← x ← Delete-Min(Q)
      6      right[z] ← y ← Delete-Min(Q)
      7      f[z] ← f[x] + f[y]
      8      Insert(Q, z)
      9  return Delete-Min(Q)
      Example frequencies: f:5  e:9  c:12  b:13  d:16  a:45
  42. Huffman's Algorithm
      On the example frequencies, the algorithm merges f+e = 14, c+b = 25, 14+d = 30, 25+30 = 55, and finally a+55 = 100, yielding the codewords
      a = 0  c = 100  b = 101  f = 1100  e = 1101  d = 111
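The pseudocode maps naturally onto Python's `heapq` as the priority queue, giving the O(n log n) running time; this is a sketch, with a tree represented simply as a character or a (left, right) pair.

```python
import heapq

def huffman(freq):
    """Huffman's algorithm, a Python sketch of the slides' pseudocode.
    Q is a binary heap, so n - 1 merges cost O(n log n) overall."""
    # Heap entries are (frequency, tie-breaker, tree); the counter keeps
    # tuple comparison from ever reaching the non-comparable tree part.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        fx, _, x = heapq.heappop(heap)                    # left[z]  <- Delete-Min(Q)
        fy, _, y = heapq.heappop(heap)                    # right[z] <- Delete-Min(Q)
        heapq.heappush(heap, (fx + fy, counter, (x, y)))  # f[z] = f[x] + f[y]
        counter += 1
    return heap[0][2]

def codewords(tree, prefix=''):
    """Read the code off the tree: left edge = 0, right edge = 1."""
    if isinstance(tree, str):
        return {tree: prefix or '0'}
    left, right = tree
    result = codewords(left, prefix + '0')
    result.update(codewords(right, prefix + '1'))
    return result

freq = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
print(codewords(huffman(freq)))
# {'a': '0', 'c': '100', 'b': '101', 'f': '1100', 'e': '1101', 'd': '111'}
```

With these frequencies the merge order is unique, so the sketch reproduces exactly the codewords shown on the slide; when frequencies tie, different tie-breaking can produce a different but equally cheap code.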
  43. Greedy Choice
      - Why is merging the two nodes with smallest frequency into a subtree a greedy choice?
      - We can alternatively define B(T) as

          B(T) = Σ_{v ∈ T} B(v),

        where B(v) = 0 if v is a leaf and B(v) = f(left[v]) + f(right[v]) if v is an internal node.
      - By merging the two nodes with lowest frequency, we greedily try to minimize the cost the new node contributes to B(T).
  44. Greedy Choice
      Lemma: There exists an optimal prefix code such that the two characters with smallest frequency are siblings and have maximal depth in T.
      Let x and y be two such characters, and let T be a tree representing an optimal prefix code. Let a and b be two sibling leaves of maximal depth in T. Assume w.l.o.g. that f(x) ≤ f(y) and f(a) ≤ f(b). This implies that f(x) ≤ f(a) and f(y) ≤ f(b). Let T' be the tree obtained by exchanging a with x and b with y.
  45. The cost difference between trees T and T' is

        B(T) – B(T') = (f(a) – f(x)) · (d_T(a) – d_T(x)) + (f(b) – f(y)) · (d_T(b) – d_T(y)) ≥ 0,

      because f(a) ≥ f(x), f(b) ≥ f(y), and a and b have maximal depth in T. Since T is optimal, this forces B(T') = B(T), so T' is an optimal tree in which x and y are sibling leaves of maximal depth.
  46. Optimal Substructure
      - After joining two nodes x and y by making them children of a new node z, the algorithm treats z as a leaf with frequency f(z) = f(x) + f(y).
      - Let C' be the character set in which x and y are replaced by the single character z with frequency f(z) = f(x) + f(y), and let T' be an optimal tree for C'.
      - Let T be the tree obtained from T' by making x and y children of z.
      - We observe the following relationship between B(T) and B(T'): B(T) = B(T') + f(x) + f(y).
  47. Lemma: If T' is optimal for C', then T is optimal for C.
      Assume the contrary. Then there exists a better tree T'' for C. There also exists a tree T''' at least as good as T'' for C in which x and y are sibling leaves of maximal depth. The removal of x and y from T''' turns their parent into a leaf, which we can associate with z. The cost of the resulting tree is B(T''') – f(x) – f(y) < B(T) – f(x) – f(y) = B(T'). This contradicts the optimality of T'. Hence, T must be optimal for C.
  48. Summary
      - Greedy algorithms are efficient algorithms for optimization problems that exhibit two properties:
        - Greedy choice property: an optimal solution can be obtained by making locally optimal choices.
        - Optimal substructure: an optimal solution contains within it optimal solutions to smaller subproblems.
      - If only optimal substructure is present, dynamic programming may be a viable approach; that is, the greedy choice property is what allows us to obtain faster algorithms than what can be obtained using dynamic programming.
