Asymptotic notations

  • There are actually 5 kinds of asymptotic notation. How many of you are familiar with all of these? What these symbols give us is a notation for talking about how fast a function goes to infinity, which is just what we want to know when we study the running times of algorithms. Instead of working out a complicated formula for the exact running time, we can just say that the running time is Θ(n²). That is, the running time is proportional to n², plus lower-order terms. For most purposes, that’s just what we want to know. One thing to keep in mind is that we’re working with functions defined on the natural numbers. Sometimes I’ll talk (a little) about doing calculus on these functions, but the point is that we won’t care what, say, f(1/2) is.
  • Sometimes we won’t know the exact order of growth. Sometimes the running time depends on the input, or we might be talking about a number of different algorithms. Then we might want to put an upper or lower bound on the order of growth. That’s what big-O and big-Omega are for. Except for Theta, the thing to remember is that the English letters are upper bounds and the Greek letters are lower bounds. (Theta is both, but it’s only a Greek letter.) So O(g(n)) is the set of functions that go to infinity no faster than g. The formal definition is the same as for Theta, except that there is only one c, and the inequality reads 0 ≤ f(n) ≤ c·g(n). We call g an asymptotic upper bound.
  • In the same way, Ω(g(n)) is the set of functions that go to infinity no slower than g(n). Again, the definition is the same except that the inequality reads 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀. Are there any questions?
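  • For reference, here are the three definitions the notes sketch, written out; they match slides 4, 11, and 15 of the deck below.

        % Formal definitions of O, Omega, and Theta for functions on the natural numbers
        \begin{align*}
          O(g(n))      &= \{\, f(n) : \exists\, c > 0,\ n_0 > 0 \ \text{such that}\ 0 \le f(n) \le c\,g(n) \ \text{for all}\ n \ge n_0 \,\} \\
          \Omega(g(n)) &= \{\, f(n) : \exists\, c > 0,\ n_0 > 0 \ \text{such that}\ 0 \le c\,g(n) \le f(n) \ \text{for all}\ n \ge n_0 \,\} \\
          \Theta(g(n)) &= \{\, f(n) : \exists\, c_1, c_2, n_0 > 0 \ \text{such that}\ c_1\,g(n) \le f(n) \le c_2\,g(n) \ \text{for all}\ n \ge n_0 \,\}
        \end{align*}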

    1. Introduction to Asymptotic Notations
       Presented by: Gaurav Mittal
    2. Asymptotic Notation
       • Θ, O, Ω, o, ω
       • Used to describe the running times of algorithms
       • Instead of the exact running time, we say Θ(n²)
       • Defined for functions whose domain is the set of natural numbers, N
       • Each notation determines a set of functions; in practice they are used to compare two functions
    3. Asymptotic Notation
       • By now you should have an intuitive feel for asymptotic (big-O) notation: what does O(n) running time mean? O(n²)? O(n lg n)?
       • Our first task is to define this notation more formally and completely
    4. Big-O Notation (Upper Bound, Worst Case)
       • For a given function g(n), we denote by O(g(n)) the set of functions
         O(g(n)) = { f(n) : there exist positive constants c > 0 and n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
       • We say g(n) is an asymptotic upper bound for f(n)
       • f(n) = O(g(n)) means that as n → ∞, the execution time f(n) is at most c·g(n) for some constant c
       • What does O(g(n)) running time mean? The worst-case running time (upper bound) is within a constant factor of g(n)
    5. Big-O Notation (Upper Bound, Worst Case)
       [Figure: plot of f(n) and c·g(n) against n; beyond n₀, f(n) lies below c·g(n), illustrating f(n) = O(g(n))]
    6. O-notation
       • For a given function g(n), we denote by O(g(n)) the set of functions
         O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
       • We say g(n) is an asymptotic upper bound for f(n)
    7. Big-O Notation (Upper Bound, Worst Case)
       • This is a mathematically formal way of ignoring constant factors and looking only at the “shape” of the function
       • f(n) = O(g(n)) should be read as saying that “f(n) is at most g(n), up to constant factors”
       • We usually take f(n) to be the running time of an algorithm and g(n) a nicely written function
       • E.g. the running time of the insertion sort algorithm is O(n²) (a sketch of the algorithm follows this slide)
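    Since the slide cites insertion sort’s O(n²) bound, here is a minimal Python sketch of the algorithm (our illustration, not part of the original deck). The nested loops are where the quadratic worst case comes from: the inner loop can shift up to i elements on each of the n outer iterations.

        # Insertion sort: worst case ~n^2 shifts (reversed input),
        # best case ~n comparisons (already-sorted input).
        def insertion_sort(a):
            for i in range(1, len(a)):
                key = a[i]          # element to place into the sorted prefix a[0..i-1]
                j = i - 1
                while j >= 0 and a[j] > key:   # shift larger elements right
                    a[j + 1] = a[j]
                    j -= 1
                a[j + 1] = key
            return a

        print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]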
    8. Big-O Notation (Upper Bound, Worst Case)
       • Example 1: Is 2n + 7 = O(n)?
       • Let T(n) = 2n + 7 = n(2 + 7/n)
       • Note that for n = 7: 2 + 7/n = 2 + 7/7 = 3
       • So T(n) ≤ 3n for all n ≥ 7 (take c = 3, n₀ = 7)
       • Then T(n) = O(n)
       • Limit test: lim_{n→∞} T(n)/n = 2, a finite constant, so T(n) = O(n)
       (a quick numeric check of these constants follows this slide)
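    As a sanity check (not a proof; the algebra above is the actual argument), this short Python snippet verifies the chosen constants c = 3 and n₀ = 7 over a range of n:

        # Spot-check: 0 <= 2n + 7 <= 3n for every tested n >= 7
        def T(n):
            return 2 * n + 7

        c, n0 = 3, 7
        assert all(0 <= T(n) <= c * n for n in range(n0, 10_000))
        print("2n + 7 <= 3n holds for every tested n >= 7")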
    9. Big-O Notation (Upper Bound, Worst Case)
       • Example 2: Is 5n³ + 2n² + n + 10⁶ = O(n³)?
       • Let T(n) = 5n³ + 2n² + n + 10⁶ = n³(5 + 2/n + 1/n² + 10⁶/n³)
       • Note that for n = 100: 5 + 2/100 + 1/100² + 10⁶/100³ = 5 + 0.02 + 0.0001 + 1 = 6.0201 ≤ 6.05
       • So T(n) ≤ 6.05·n³ for all n ≥ 100 (take c = 6.05, n₀ = 100)
       • Then T(n) = O(n³)
       • Limit test: lim_{n→∞} T(n)/n³ = 5, a finite constant, so T(n) = O(n³)
       (a symbolic check of this limit follows this slide)
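    The limit test on this slide can also be checked symbolically; a quick Python sketch using SymPy (assuming the library is available; not part of the original deck):

        # Symbolic limit check: lim n->oo T(n)/n^3 = 5, a finite nonzero
        # constant, which gives T(n) = Theta(n^3) and in particular O(n^3).
        import sympy

        n = sympy.symbols('n', positive=True)
        T = 5*n**3 + 2*n**2 + n + 10**6

        print(sympy.limit(T / n**3, n, sympy.oo))   # prints 5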
    10. Big-O Notation (Upper Bound, Worst Case)
        • Express the execution time as a function of the input size n
        • Since only the growth rate matters, we can ignore multiplicative constants and lower-order terms, e.g.,
          - n, n + 1, n + 80, 40n, and n + log n are all O(n)
          - n^1.1 + 10000000000n is O(n^1.1)
          - n² is O(n²)
          - 3n² + 6n + log n + 24.5 is O(n²)
        • O(1) < O(log n) < O((log n)³) < O(n) < O(n²) < O(n³) < O(n^(log n)) < O(2^√n) < O(2ⁿ) < O(n!) < O(nⁿ)
        • Constant < Logarithmic < Linear < Quadratic < Cubic < Polynomial < Exponential < Factorial
        (a small numeric demo of this ordering follows this slide)
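    To make the ordering concrete, a small Python demo (our illustration) evaluates a few of these growth functions at increasing n; the gaps widen rapidly:

        # Evaluate representative growth functions at a few input sizes.
        import math

        growth = [
            ("log n",   lambda n: math.log2(n)),
            ("n",       lambda n: n),
            ("n log n", lambda n: n * math.log2(n)),
            ("n^2",     lambda n: n ** 2),
            ("2^n",     lambda n: 2 ** n),
        ]

        for n in (10, 20, 40):
            row = ", ".join(f"{name} = {f(n):,.0f}" for name, f in growth)
            print(f"n = {n}: {row}")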
    11. Ω-notation (Omega) (Lower Bound, Best Case)
        • For a given function g(n), we denote by Ω(g(n)) the set of functions
          Ω(g(n)) = { f(n) : there exist positive constants c > 0 and n₀ > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }
        • We say g(n) is an asymptotic lower bound for f(n)
        • f(n) = Ω(g(n)) means that as n → ∞, the execution time f(n) is at least c·g(n) for some constant c
        • What does Ω(g(n)) running time mean? The best-case running time (lower bound) is within a constant factor of g(n)
    12. Ω-notation (Lower Bound, Best Case)
        [Figure: plot of f(n) and c·g(n) against n; beyond n₀, f(n) lies above c·g(n), illustrating f(n) = Ω(g(n))]
    13. Ω-notation
        • For a given function g(n), we denote by Ω(g(n)) the set of functions
          Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }
        • We say g(n) is an asymptotic lower bound for f(n)
    14. Ω-notation (Omega) (Lower Bound, Best Case)
        • We say insertion sort’s running time T(n) is Ω(n)
        • For example:
          - the worst-case running time of insertion sort is O(n²), and
          - the best-case running time of insertion sort is Ω(n)
        • The running time therefore falls anywhere between a linear function of n and a quadratic function of n
        (a comparison-counting demo of both cases follows this slide)
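    The gap between the two bounds is easy to observe by counting inner-loop comparisons; a measurement sketch in Python (our illustration, reusing the insertion sort from slide 7 with a counter added):

        # Sorted input hits the Omega(n) best case (~n-1 comparisons);
        # reversed input hits the O(n^2) worst case (~n(n-1)/2 comparisons).
        def insertion_sort_comparisons(a):
            comparisons = 0
            for i in range(1, len(a)):
                key, j = a[i], i - 1
                while j >= 0:
                    comparisons += 1
                    if a[j] <= key:
                        break
                    a[j + 1] = a[j]
                    j -= 1
                a[j + 1] = key
            return comparisons

        n = 100
        print(insertion_sort_comparisons(list(range(n))))         # 99
        print(insertion_sort_comparisons(list(range(n, 0, -1))))  # 4950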
    15. Θ-notation (Theta) (Tight Bound)
        • In some cases, f(n) = O(g(n)) and f(n) = Ω(g(n))
        • This means that the worst and best cases require the same amount of time, to within a constant factor
        • In this case we use a new notation, “theta Θ”
        • For a given function g(n), we denote by Θ(g(n)) the set of functions
          Θ(g(n)) = { f(n) : there exist positive constants c₁ > 0, c₂ > 0 and n₀ > 0 such that c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀ }
    16. Θ-notation (Theta) (Tight Bound)
        • We say g(n) is an asymptotically tight bound for f(n)
        • Θ(g(n)) means that as n → ∞, the execution time f(n) is at most c₂·g(n) and at least c₁·g(n) for some constants c₁ and c₂
        • f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n))
        (a worked instance of the definition follows this slide)
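    As a worked instance of the definition (a standard textbook example, not from the original deck), one can show that f(n) = n²/2 − 3n is Θ(n²):

        % Find c1, c2, n0 with c1 n^2 <= n^2/2 - 3n <= c2 n^2 for all n >= n0.
        % Upper bound: n^2/2 - 3n <= n^2/2 for every n >= 1, so c2 = 1/2 works.
        % Lower bound: n^2/14 <= n^2/2 - 3n  iff  3n <= (1/2 - 1/14) n^2 = (3/7) n^2
        %              iff  n >= 7, so c1 = 1/14 and n0 = 7 work.
        \[
          \tfrac{1}{14}\,n^2 \;\le\; \tfrac{n^2}{2} - 3n \;\le\; \tfrac{1}{2}\,n^2
          \quad \text{for all } n \ge 7,
          \quad \text{hence } \tfrac{n^2}{2} - 3n = \Theta(n^2).
        \]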
    17. Θ-notation (Theta) (Tight Bound)
        [Figure: plot of f(n) between c₁·g(n) and c₂·g(n) against n; beyond n₀, f(n) stays between the two curves, illustrating f(n) = Θ(g(n))]
    18. Conclusion
        • Asymptotic notations describe the efficiency and performance of an algorithm in a meaningful way.
        • These notations let us decide which of two algorithms is better and more efficient.
