02 asymptotic-notation-and-recurrences

Transcript

  • 1. Introduction to Algorithms 6.046J/18.401J, Lecture 2.
    Asymptotic notation: O-, Ω-, and Θ-notation.
    Recurrences: substitution method, iterating the recurrence, recursion tree, master method.
    Prof. Erik Demaine, September 12, 2005. Copyright © 2001-5 Erik D. Demaine and Charles E. Leiserson.
  • 2-5. Asymptotic notation: O-notation (upper bounds). We write f(n) = O(g(n)) if there exist constants c > 0, n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀.
    EXAMPLE: 2n² = O(n³) (c = 1, n₀ = 2).
    Note: the notation relates functions, not values, and the "=" here is a funny, "one-way" equality.
  • 6-8. Set definition of O-notation: O(g(n)) = { f(n) : there exist constants c > 0, n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }.
    EXAMPLE: 2n² ∈ O(n³).
    (Logicians would write λn.2n² ∈ O(λn.n³), but it is convenient to be sloppy, as long as we understand what is really going on.)
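As a quick illustration (not from the slides): the snippet below spot-checks the witnesses c = 1, n₀ = 2 for the example 2n² = O(n³) over a finite range. A finite check only supports the claim; it is not a proof.

```python
# Finite sanity check of 2n^2 = O(n^3) with the witnesses c = 1, n0 = 2.
# A check over a finite range supports the claim but does not prove it.

def witnesses_hold(f, g, c, n0, n_max=10_000):
    """Return True if 0 <= f(n) <= c*g(n) for every n0 <= n <= n_max."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

print(witnesses_hold(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=2))  # True
```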
  • 9-11. Macro substitution. Convention: a set in a formula represents an anonymous function in the set.
    EXAMPLE: f(n) = n³ + O(n²) means f(n) = n³ + h(n) for some h(n) ∈ O(n²).
    EXAMPLE: n² + O(n) = O(n²) means that for any f(n) ∈ O(n), n² + f(n) = h(n) for some h(n) ∈ O(n²).
  • 12-14. Ω-notation (lower bounds). O-notation is an upper-bound notation; it makes no sense to say f(n) is at least O(n²).
    Ω(g(n)) = { f(n) : there exist constants c > 0, n₀ > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }.
    EXAMPLE: √n = Ω(lg n) (c = 1, n₀ = 16).
  • 15-16. Θ-notation (tight bounds): Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).
    EXAMPLE: (1/2)n² - 2n = Θ(n²).
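The Ω and Θ examples above can be spot-checked the same way. The Θ constants below (c₁ = 1/4, c₂ = 1/2, n₀ = 8) are my own illustrative choice; the slides only assert that some constants exist.

```python
# Finite spot-checks of the Omega and Theta examples (constants for the
# Theta check are one illustrative choice; a finite check is not a proof).
import math

# sqrt(n) = Omega(lg n) with c = 1, n0 = 16
omega_ok = all(math.sqrt(n) >= math.log2(n) for n in range(16, 10_001))

# (1/2)n^2 - 2n = Theta(n^2): try c1 = 1/4, c2 = 1/2, n0 = 8
f = lambda n: 0.5 * n**2 - 2 * n
theta_ok = all(0 <= 0.25 * n**2 <= f(n) <= 0.5 * n**2 for n in range(8, 10_001))

print(omega_ok, theta_ok)  # True True
```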
  • 17-18. o-notation and ω-notation. O-notation and Ω-notation are like ≤ and ≥; o-notation and ω-notation are like < and >.
    o(g(n)) = { f(n) : for any constant c > 0, there is a constant n₀ > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n₀ }. EXAMPLE: 2n² = o(n³) (n₀ = 2/c).
    ω(g(n)) = { f(n) : for any constant c > 0, there is a constant n₀ > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n₀ }. EXAMPLE: √n = ω(lg n) (n₀ = 1 + 1/c).
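To see how the "for any c > 0" quantifier works in 2n² = o(n³), this sketch (my addition) picks several values of c, takes n₀ just above the slide's 2/c, and confirms the strict inequality beyond that point.

```python
# For 2n^2 = o(n^3): for ANY c > 0 there must be an n0 with 2n^2 < c*n^3
# for all n >= n0. Taking n0 just above 2/c works, since 2n^2 < c*n^3 <=> n > 2/c.
import math

for c in (1.0, 0.1, 0.01):
    n0 = math.floor(2 / c) + 1          # smallest integer strictly above 2/c
    ok = all(2 * n**2 < c * n**3 for n in range(n0, n0 + 1000))
    print(f"c={c}: n0={n0}, holds on checked range: {ok}")
```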
  • 19. Solving recurrences.
    The analysis of merge sort from Lecture 1 required us to solve a recurrence.
    Solving recurrences is like solving integrals, differential equations, etc.: learn a few tricks.
    Lecture 3: applications of recurrences to divide-and-conquer algorithms.
  • 20-21. Substitution method. The most general method:
    1. Guess the form of the solution.
    2. Verify by induction.
    3. Solve for constants.
    EXAMPLE: T(n) = 4T(n/2) + n.
    • Assume that T(1) = Θ(1).
    • Guess O(n³). (Prove O and Ω separately.)
    • Assume that T(k) ≤ ck³ for k < n.
    • Prove T(n) ≤ cn³ by induction.
  • 22. Example of substitution:
    T(n) = 4T(n/2) + n
         ≤ 4c(n/2)³ + n
         = (c/2)n³ + n
         = cn³ - ((c/2)n³ - n)    [desired - residual]
         ≤ cn³                    [desired]
    whenever (c/2)n³ - n ≥ 0, for example, if c ≥ 2 and n ≥ 1.
  • 23-24. Example (continued).
    We must also handle the initial conditions, that is, ground the induction with base cases.
    Base: T(n) = Θ(1) for all n < n₀, where n₀ is a suitable constant.
    For 1 ≤ n < n₀, we have "Θ(1)" ≤ cn³ if we pick c big enough.
    This bound is not tight!
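A numeric companion to the substitution argument (my addition): tabulate T(n) = 4T(n/2) + n on powers of two, where the halving is exact, using the arbitrary Θ(1) base case T(1) = 1, and compare with cn³ for the constant c = 2 derived above.

```python
# Tabulate T(n) = 4*T(n/2) + n with T(1) = 1 on powers of two and compare
# against the bound c*n^3 with c = 2 (the constant found in the slides).
# T(1) = 1 is an arbitrary Theta(1) base case chosen for illustration.
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    return 1 if n <= 1 else 4 * T(n // 2) + n

for k in range(0, 11):
    n = 2**k
    assert T(n) <= 2 * n**3, "bound violated"
    print(f"n={n:5d}  T(n)={T(n):10d}  2n^3={2 * n**3:12d}")
```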
  • 25-28. A tighter upper bound? We shall prove that T(n) = O(n²).
    Assume that T(k) ≤ ck² for k < n:
    T(n) = 4T(n/2) + n
         ≤ 4c(n/2)² + n
         = cn² + n
         = O(n²)    Wrong! We must prove the I.H.
         = cn² - (-n)    [desired - residual]
         ≤ cn²    for no choice of c > 0. Lose!
  • 29-31. A tighter upper bound! IDEA: Strengthen the inductive hypothesis by subtracting a low-order term.
    Inductive hypothesis: T(k) ≤ c₁k² - c₂k for k < n.
    T(n) = 4T(n/2) + n
         ≤ 4(c₁(n/2)² - c₂(n/2)) + n
         = c₁n² - 2c₂n + n
         = c₁n² - c₂n - (c₂n - n)
         ≤ c₁n² - c₂n    if c₂ ≥ 1.
    Pick c₁ big enough to handle the initial conditions.
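The strengthened hypothesis can also be checked numerically (my addition). With the base case T(1) = 1, the constants c₁ = 2, c₂ = 1 satisfy both the base case (1 ≤ c₁ - c₂) and the inductive step (c₂ ≥ 1); on powers of two this particular choice in fact meets the bound with equality, T(n) = 2n² - n.

```python
# Check the strengthened hypothesis T(n) <= c1*n^2 - c2*n with c1 = 2, c2 = 1
# for T(n) = 4*T(n/2) + n, T(1) = 1, on powers of two (exact halving).
# With this particular base case the bound happens to hold with equality.
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    return 1 if n <= 1 else 4 * T(n // 2) + n

for k in range(0, 11):
    n = 2**k
    print(n, T(n), 2 * n**2 - n, T(n) <= 2 * n**2 - n)  # last column: True
```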
  • 32. Recursion-tree method.
    A recursion tree models the costs (time) of a recursive execution of an algorithm.
    The recursion-tree method can be unreliable, just like any method that uses ellipses (…).
    It promotes intuition, however, and is good for generating guesses for the substitution method.
  • 33-41. Example of recursion tree. Solve T(n) = T(n/4) + T(n/2) + n²:
    The root costs n²; its children cost (n/4)² and (n/2)²; their children cost (n/16)², (n/8)², (n/8)², (n/4)²; and so on down to Θ(1) leaves.
    Per-level costs: the top level sums to n², the second level to (5/16)n², the third level to (25/256)n², ...
    Total = n² (1 + 5/16 + (5/16)² + (5/16)³ + ⋯) = Θ(n²)   (geometric series).
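The geometric series sums to 1/(1 - 5/16) = 16/11, so the idealized tree gives T(n) ≤ (16/11)n². The sketch below (my addition; floor division stands in for the idealized exact halving and quartering) watches T(n)/n² settle just under that constant.

```python
# Numeric look at T(n) = T(n/4) + T(n/2) + n^2 (floor division approximates
# the idealized recurrence). The ratio T(n)/n^2 should stay bounded by roughly
# 1/(1 - 5/16) = 16/11 ~= 1.4545, consistent with T(n) = Theta(n^2).
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    return 1 if n <= 1 else T(n // 4) + T(n // 2) + n * n

for k in range(4, 21, 4):
    n = 2**k
    print(f"n=2^{k:2d}  T(n)/n^2 = {T(n) / n**2:.4f}")
```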
  • 42. The master method. The master method applies to recurrences of the form T(n) = a·T(n/b) + f(n), where a ≥ 1, b > 1, and f is asymptotically positive.
  • 43-45. Three common cases. Compare f(n) with n^(log_b a):
    1. f(n) = O(n^(log_b a - ε)) for some constant ε > 0: f(n) grows polynomially slower than n^(log_b a) (by an n^ε factor). Solution: T(n) = Θ(n^(log_b a)).
    2. f(n) = Θ(n^(log_b a) lg^k n) for some constant k ≥ 0: f(n) and n^(log_b a) grow at similar rates. Solution: T(n) = Θ(n^(log_b a) lg^(k+1) n).
    3. f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0: f(n) grows polynomially faster than n^(log_b a) (by an n^ε factor), and f(n) satisfies the regularity condition that a·f(n/b) ≤ c·f(n) for some constant c < 1. Solution: T(n) = Θ(f(n)).
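As a compact restatement (my addition, not the lecture's code): a hypothetical helper that decides the case when the driving function is a plain power, f(n) = n^d, which is all the examples on the next slide need. It is a sketch for that restricted form, not a full master-theorem implementation.

```python
# Decide the master-theorem case for T(n) = a*T(n/b) + n^d, with f restricted
# to plain powers of n (no log factors). No separate regularity check is
# needed here: a*(n/b)^d = (a/b^d)*n^d <= c*n^d with c = a/b^d < 1 exactly
# when d > log_b(a).
import math

def master_power(a: int, b: int, d: float) -> str:
    crit = math.log(a, b)                     # critical exponent log_b(a)
    if math.isclose(d, crit):
        return f"Case 2 (k = 0): T(n) = Theta(n^{crit:g} * lg n)"
    if d < crit:
        return f"Case 1: T(n) = Theta(n^{crit:g})"
    return f"Case 3: T(n) = Theta(n^{d:g})"

print(master_power(4, 2, 1))   # Case 1: Theta(n^2)
print(master_power(4, 2, 2))   # Case 2: Theta(n^2 * lg n)
print(master_power(4, 2, 3))   # Case 3: Theta(n^3)
```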
  • 46-49. Examples.
    EX. T(n) = 4T(n/2) + n: a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n. CASE 1: f(n) = O(n^(2-ε)) for ε = 1. ⇒ T(n) = Θ(n²).
    EX. T(n) = 4T(n/2) + n²: a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n². CASE 2: f(n) = Θ(n² lg^0 n), that is, k = 0. ⇒ T(n) = Θ(n² lg n).
    EX. T(n) = 4T(n/2) + n³: a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n³. CASE 3: f(n) = Ω(n^(2+ε)) for ε = 1, and 4(n/2)³ ≤ cn³ (regularity condition) for c = 1/2. ⇒ T(n) = Θ(n³).
    EX. T(n) = 4T(n/2) + n²/lg n: a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n²/lg n. The master method does not apply: in particular, for every constant ε > 0, we have n^ε = ω(lg n).
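A numeric cross-check of the three solvable examples (my addition, with the arbitrary base case T(1) = 1): on powers of two the ratios T(n)/n², T(n)/(n² lg n), and T(n)/n³ each settle toward a constant, matching cases 1, 2, and 3.

```python
# Cross-check the first three examples on powers of two (base case T(1) = 1
# chosen arbitrarily as a Theta(1) value).
from functools import lru_cache

def solve(f, n):
    """Evaluate T(n) = 4*T(n/2) + f(n), T(1) = 1, for n a power of two."""
    @lru_cache(maxsize=None)
    def T(m):
        return 1 if m <= 1 else 4 * T(m // 2) + f(m)
    return T(n)

for k in (8, 12, 16):
    n = 2**k                                   # so lg n = k
    print(f"n=2^{k}:",
          f"T/n^2 = {solve(lambda m: m, n) / n**2:.3f},",              # case 1
          f"T/(n^2 lg n) = {solve(lambda m: m * m, n) / (n**2 * k):.3f},",  # case 2
          f"T/n^3 = {solve(lambda m: m**3, n) / n**3:.3f}")            # case 3
```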
  • 50-56. Idea of master theorem. Recursion tree: the root costs f(n); it has a children, each costing f(n/b); each of those has a children, each costing f(n/b²); and so on down to Θ(1) leaves.
    Per-level costs: f(n) at the root, a·f(n/b) at the next level, a²·f(n/b²) at the next, ...
    Height: h = log_b n. Number of leaves: a^h = a^(log_b n) = n^(log_b a), so the leaf level costs Θ(n^(log_b a)).
    CASE 1: The weight increases geometrically from the root to the leaves; the leaves hold a constant fraction of the total weight. Total: Θ(n^(log_b a)).
    CASE 2 (k = 0): The weight is approximately the same on each of the log_b n levels. Total: Θ(n^(log_b a) lg n).
    CASE 3: The weight decreases geometrically from the root to the leaves; the root holds a constant fraction of the total weight. Total: Θ(f(n)).
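To make the weight-per-level picture concrete (my addition), the snippet computes the idealized level costs aⁱ·f(n/bⁱ) for the three example recurrences and shows them growing geometrically toward the leaves (case 1), staying flat (case 2), and shrinking geometrically (case 3).

```python
# Idealized per-level costs a^i * f(n/b^i) for T(n) = a*T(n/b) + f(n),
# using exact (real-valued) division; only the first few levels are shown.
def level_costs(a, b, f, n, levels=5):
    return [round(a**i * f(n / b**i)) for i in range(levels)]

n = 2**10
print("case 1, f(n)=n:  ", level_costs(4, 2, lambda x: x, n))      # grows toward the leaves
print("case 2, f(n)=n^2:", level_costs(4, 2, lambda x: x * x, n))  # flat across levels
print("case 3, f(n)=n^3:", level_costs(4, 2, lambda x: x**3, n))   # shrinks toward the leaves
```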
  • 57. Appendix: geometric series.
    1 + x + x² + ⋯ + xⁿ = (1 - x^(n+1)) / (1 - x)   for x ≠ 1.
    1 + x + x² + ⋯ = 1 / (1 - x)   for |x| < 1.
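For completeness, the finite formula follows from a one-line telescoping argument (my addition, a standard derivation):

```latex
% Telescoping derivation of the finite geometric-series formula (x \neq 1).
\[
  (1-x)\bigl(1 + x + x^2 + \cdots + x^n\bigr)
    = \bigl(1 + x + \cdots + x^n\bigr) - \bigl(x + x^2 + \cdots + x^{n+1}\bigr)
    = 1 - x^{n+1},
\]
\[
  \text{so}\quad 1 + x + x^2 + \cdots + x^n = \frac{1 - x^{n+1}}{1 - x}.
\]
% For |x| < 1, letting n \to \infty sends x^{n+1} \to 0, giving 1 + x + x^2 + \cdots = 1/(1-x).
```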
