Asymptotic Analysis


Transcript

  • 1. Asymptotic Analysis (Adnan Shahzada)
  • 2. As mentioned in lecture 1, growth rates of algorithms are important:
    • How fast do the resources an algorithm needs grow with respect to the size of the problem?
    • What we are primarily interested in is the asymptotic growth rate: how fast the resources grow when the problem size is extremely large; more specifically, how the resource use for a problem of size n compares with that for size n + 1 as n → ∞.
  • 3. We are interested in asymptotic growth rates for two reasons:
    • For practical purposes, large problems are where we expect to have big computational requirements.
    • For theoretical purposes, concentrating on growth rates frees us from distracting details: fixed costs (e.g. switching the computer on!), which may dominate for a small problem size but are largely irrelevant beyond it, and machine and implementation details.
    • The growth rate is a compact, easy-to-understand function.
  • 4. These notations are used to describe the asymptotic running time of an algorithm.
    • Asymptotic time complexity: the limiting behavior of the execution time of an algorithm as the size of the problem goes to infinity.
    • Asymptotic space complexity: the limiting behavior of the memory use of an algorithm as the size of the problem goes to infinity.
  • 5. Consider two functions t(n) ≥ 0 and f(n) ≥ 0.
    • We say that t(n) is in the order of f(n) if t(n) is bounded above by a constant multiple of f(n) for all sufficiently large n:
    • t(n) ≤ c·f(n) for all n ≥ n0, where c is a positive constant and n0 is an integer threshold.
  • 6. If t(n) is in the order of f(n), we write this as: t(n) is in O(f(n)).
    • Often you will see things written like t(n) = O(f(n)). Strictly speaking, this use of the equals sign is incorrect: the relationship is set membership, not an equality, so t(n) ∈ O(f(n)) is better.
  • 7. Example 1: 2n + 10 is O(n).
    • 2n + 10 ≤ cn means (c − 2)n ≥ 10, i.e. n ≥ 10/(c − 2); pick c = 3 and n0 = 10.
    • Example 2: 3n³ + 20n² + 5 is O(n³): we need c > 0 and n0 ≥ 1 such that 3n³ + 20n² + 5 ≤ c·n³ for all n ≥ n0, which is true for c = 4 and n0 = 21. (A numeric spot-check of both witness pairs follows below.)
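Witness constants like these are easy to get slightly wrong, so a finite numeric check is a useful sanity test. The sketch below is illustrative Python, not part of the slides; the helper name `holds` is made up, and sampling finitely many n can refute a bad pair (c, n0) but never prove a good one.

    def holds(t, f, c, n0, n_max=10_000):
        """Spot-check the Big-O inequality t(n) <= c * f(n) on [n0, n_max]."""
        return all(t(n) <= c * f(n) for n in range(n0, n_max + 1))

    # Example 1: 2n + 10 is O(n), witnessed by c = 3, n0 = 10.
    print(holds(lambda n: 2 * n + 10, lambda n: n, c=3, n0=10))  # True

    # Example 2: 3n^3 + 20n^2 + 5 is O(n^3), witnessed by c = 4, n0 = 21.
    t = lambda n: 3 * n**3 + 20 * n**2 + 5
    f = lambda n: n**3
    print(holds(t, f, c=4, n0=21))   # True
    print(t(20) <= 4 * f(20))        # False: n0 = 20 is just too small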
  • 8. Rules:
    1. Drop lower-order terms.
    2. Drop constant factors.
    • Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n²)".
    • Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".
  • 9. The maximum rule states that O(f(n) + g(n)) = O(max(f(n), g(n))): the term with the largest order dominates.
    • Proof: f(n) + g(n) = max(f(n), g(n)) + min(f(n), g(n)) ≤ 2·max(f(n), g(n)), and the constant factor of 2 can be absorbed into the constant c of the Big-O definition (see the sketch below).
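The inequality in the proof holds pointwise, which a quick check makes concrete. A minimal illustrative sketch (not from the slides), using n² and n log n as the two terms:

    import math

    # Sketch: pointwise check of f(n) + g(n) <= 2 * max(f(n), g(n)),
    # the inequality behind the maximum rule.
    f = lambda n: n * n
    g = lambda n: n * math.log2(n)

    assert all(f(n) + g(n) <= 2 * max(f(n), g(n)) for n in range(2, 10_000))
    print("f + g never exceeds twice the dominant term")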
  • 10. Seven functions that often appear in algorithm analysis:
    • Constant: 1
    • Logarithmic: log n
    • Linear: n
    • N-log-N: n log n
    • Quadratic: n²
    • Cubic: n³
    • Exponential: 2ⁿ
    • Examples?
  • 11. In increasing order of growth rate:
    • c (constant)
    • log n
    • n
    • n log n
    • nᵏ (obviously n² is lower than n³, and so on)
    • kⁿ
    • n!
    • nⁿ
  • 12. Growth of common functions by input size n:

        n     | 1 | log n | n     | n log n | n²  | n³   | 2ⁿ
        5     | 1 | 3     | 5     | 15      | 25  | 125  | 32
        10    | 1 | 4     | 10    | 33      | 100 | 10³  | 10³
        100   | 1 | 7     | 100   | 664     | 10⁴ | 10⁶  | 10³⁰
        1000  | 1 | 10    | 1000  | 10⁴     | 10⁶ | 10⁹  | 10³⁰⁰
        10000 | 1 | 13    | 10000 | 10⁵     | 10⁸ | 10¹² | 10³⁰⁰⁰
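A few lines of Python regenerate the non-constant columns (an illustrative sketch, not part of the slides). It prints exact values, so small entries differ slightly from the table's rounded ones, and 2ⁿ is reported as a power of ten via n·log₁₀(2) to keep the huge values readable:

    import math

    # Sketch: recompute the growth table. 2**n is shown as 10^k with
    # k = round(n * log10(2)), since the exact value is unreadably large.
    print(f"{'n':>6} {'log n':>6} {'n log n':>8} {'n^2':>10} {'n^3':>14} {'2^n':>9}")
    for n in (5, 10, 100, 1000, 10000):
        row = (round(math.log2(n)), round(n * math.log2(n)),
               n**2, n**3, f"10^{round(n * math.log10(2))}")
        print(f"{n:>6} {row[0]:>6} {row[1]:>8} {row[2]:>10} {row[3]:>14} {row[4]:>9}")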
  • 13. Big-O represents an upper bound on algorithmic cost; Big-Ω represents a lower bound.
    • Def: t(n) ∈ Ω(f(n)) ⟺ t(n) ≥ c·f(n) for all n ≥ n0.
    • 3n + 2 is Ω(n), with c = 2 and n0 = 2.
    • 3n³ + 3n − 1 is Ω(n³), with c = 2 and n0 = 2.
    • Note the duality rule: t(n) ∈ Ω(f(n)) ⟺ f(n) ∈ O(t(n)). (A spot-check of the first example and its dual follows below.)
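The duality rule can be spot-checked numerically just like the Big-O witnesses earlier (again an illustrative sketch; a finite scan refutes a bad pair but proves nothing):

    # Sketch: t(n) >= c * f(n) (Omega side) is equivalent to
    # f(n) <= (1/c) * t(n) (Big-O side); check both on sample points.
    t = lambda n: 3 * n + 2
    f = lambda n: n
    c, n0 = 2, 2

    omega_side = all(t(n) >= c * f(n) for n in range(n0, 10_000))
    big_o_side = all(f(n) <= (1 / c) * t(n) for n in range(n0, 10_000))
    print(omega_side, big_o_side)  # True True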
  • 14. Θ-notation: the asymptotic tight bound.
    • A function f(n) is Θ(g(n)) if there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
    • When a problem is Θ(n), this represents both an upper and a lower bound, i.e. it is O(n) and Ω(n); we say there is no algorithmic gap.
    • Is 3n + 2 = Θ(n)? (See the sketch below.)
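The answer to the slide's question is yes; one witness is c1 = 3, c2 = 4, n0 = 2 (hand-picked for this sketch, which is illustrative and not from the slides):

    # Sketch: two-sided Theta check for f(n) = 3n + 2 against g(n) = n,
    # using the hand-picked constants c1 = 3, c2 = 4, n0 = 2.
    f = lambda n: 3 * n + 2
    g = lambda n: n

    tight = all(3 * g(n) <= f(n) <= 4 * g(n) for n in range(2, 10_000))
    print(tight)  # True, so 3n + 2 is Theta(n)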
  • 15. f(n) = o(g(n)) means f(n) becomes insignificant relative to g(n) as n approaches infinity.
    • Formally, f(n) = o(g(n)) means that for all c > 0 there exists some n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n > n0.
    • Is 7n = o(n²)?
  • 16. f(n) = ω(g(n)) means that for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0.
    • Compare with little-o: f(n) = o(g(n)) means for all c > 0 there exists some n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n > n0.
    • Is 7n² = ω(n)? (The sketch below tests both of the last two questions.)
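Both questions reduce to watching the ratio f(n)/g(n): it tends to 0 for little-o and grows without bound for little-omega. A minimal illustrative sketch:

    # Sketch: ratio test for the two slide questions.
    # f = o(g) when f(n)/g(n) -> 0; f = omega(g) when f(n)/g(n) -> infinity.
    for n in (10, 1_000, 100_000, 10_000_000):
        print(n, (7 * n) / n**2, (7 * n**2) / n)
    # The first ratio shrinks toward 0, so 7n is o(n^2);
    # the second grows without bound, so 7n^2 is omega(n).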
  • 17. If two algorithms have the same asymptotic complexity, say O(n²), will the execution times of the two algorithms always be the same?
    • How do we select between two algorithms that have the same asymptotic performance? (A timing sketch follows below.)
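No: Big-O hides constant factors and lower-order terms, so two O(n²) algorithms can differ substantially in wall-clock time. The illustrative sketch below (both functions are made up for the demo, and absolute timings depend on the machine) shows two loops with the same asymptotic shape but different constants:

    import time

    # Sketch: two algorithms that are both O(n^2); the class hides
    # their different per-step constant factors.
    def quadratic_lean(n):
        total = 0
        for i in range(n):
            for j in range(n):
                total += 1
        return total

    def quadratic_heavy(n):
        total = 0
        for i in range(n):
            for j in range(n):
                total += (i * j) % 7 + 1  # same shape, more work per step
        return total

    for fn in (quadratic_lean, quadratic_heavy):
        start = time.perf_counter()
        fn(1000)
        print(fn.__name__, f"{time.perf_counter() - start:.3f}s")

In practice, ties in asymptotic class are broken by measured constants, memory behavior, and implementation effort.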
  • 18. A common misconception is that worst-case running time is somehow defined by big-Oh, and that best-case running time is defined by big-Omega.
    • There is no formal relationship like this.
    • However, worst case and big-Oh are commonly used together, because both are techniques for finding an upper bound on running time.