2. Rate of Growth or Order of Growth
● Rate/Order of growth – consider only the leading term of a
formula
● Ignore the lower-order terms – insignificant for large
values of n
● Ignore the leading term's constant coefficient – constant
factors are less significant
3. Rate of Growth or Order of Growth
►T(n) = an² + bn + c
►we say T(n) is Θ(n²) ("theta of n-squared")
• simplifying abstraction
• consider only the leading term of a formula
• lower-order terms – relatively insignificant for
large values of n
• ignore the leading term's constant coefficient
4. Asymptotic Efficiency
►when the input size is large enough, only the order of
growth is relevant
►asymptotic efficiency – how the running time
increases with the size of the input, in the limit
►asymptotically more efficient – the best choice for all
but very small inputs
5. Asymptotic Notations
►Domain of functions – the set of natural numbers
N = {0, 1, 2, ...} (as given by CLRS [1])
• T(n) is usually defined only on integer input sizes
►T(n) = an² + bn + c
•T(n) is Θ(n²) ("theta of n-squared")
•T(n) is O(n²) ("big-O of n-squared")
•T(n) is Ω(n²) ("big-omega of n-squared")
6. O notation (big-Oh)
► T(n) = n² + 2n + 1 for n > 1, T(1) = 4
► T(n) ≤ 4n², for n ≥ 1
► T(n) ≤ cn², for n ≥ n0 (c = 4 and n0 = 1)
► we say T(n) is O(n²)
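The slide's witnesses can be spot-checked numerically (a sketch, not a proof; the helper name `T` is mine):

```python
# Spot-check (not a proof) that T(n) = n^2 + 2n + 1 satisfies
# T(n) <= c*n^2 with the slide's witnesses c = 4 and n0 = 1.
def T(n):
    return n * n + 2 * n + 1

c, n0 = 4, 1
assert all(T(n) <= c * n * n for n in range(n0, 10_000))
```

Since (n + 1)² ≤ (2n)² for n ≥ 1, the inequality in fact holds for every n ≥ 1, which the finite check merely illustrates.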
8. O notation
►T(n) is O(n²)
• There are positive constants c and n0 such
that T(n) ≤ cn² for n ≥ n0
9. Some functions:
T1(n) = 5n² T2(n) = n² + 2n T3(n) = n + 5
• T1(n) ≤ 5n² for n ≥ 1
• T2(n) ≤ 2n² for n ≥ 2
• T3(n) ≤ n² for n ≥ 3
10. Generalizing…..
■ There exist positive constants c = 5 and n0 = 1
such that T1(n) ≤ cn² for n ≥ n0
■ There exist positive constants c = 2 and n0 = 2
such that T2(n) ≤ cn² for n ≥ n0
■ There exist positive constants c = 1 and n0 = 3
such that T3(n) ≤ cn² for n ≥ n0
There exist positive constants c and n0 such
that f(n) ≤ cn² for n ≥ n0
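The three witness pairs above can be checked together (a quick numeric sketch; the list layout is mine, the (c, n0) values come from the slide):

```python
# Each entry: (function, c, n0) from the slide; verify f(n) <= c*n^2
# over a finite range (a spot-check, not a proof).
bounds = [
    (lambda n: 5 * n * n,     5, 1),  # T1(n) = 5n^2
    (lambda n: n * n + 2 * n, 2, 2),  # T2(n) = n^2 + 2n
    (lambda n: n + 5,         1, 3),  # T3(n) = n + 5
]
for f, c, n0 in bounds:
    assert all(f(n) <= c * n * n for n in range(n0, 10_000))
```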
11. The set O(n²) (read "big-O of n²" or "oh of n²")
■ Set of all f(n) such that there exist positive
constants c and n0 such that f(n) ≤ cn² for
all n ≥ n0
◻ This set is denoted by O(n²)
■ T1(n) ∈ O(n²) T2(n) ∈ O(n²) T3(n) ∈ O(n²)
■ O(n²) is a set of functions.
O(n²) = { f(n): there exist positive constants c and n0
such that 0 ≤ f(n) ≤ cn² for all n ≥ n0 }
12. The set O(n²) – contd.
■ Give some more functions that belong to the
set O(n²).
◻ f1(n) = 100n² + n + 5
◻ f2(n) = 6n + 3
◻ f3(n) = 10000n²
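One valid witness for each of these functions (my choices; many others work) can be spot-checked:

```python
# f1, f2, f3 from the slide, each with one c that works for n0 = 1:
# f1(n) = 100n^2 + n + 5 <= 106 n^2   (since n <= n^2 and 5 <= 5n^2)
# f2(n) = 6n + 3          <= 9 n^2    (since 6n <= 6n^2 and 3 <= 3n^2)
# f3(n) = 10000 n^2       <= 10000 n^2
members = [
    (lambda n: 100 * n * n + n + 5,   106),
    (lambda n: 6 * n + 3,               9),
    (lambda n: 10000 * n * n,       10000),
]
for f, c in members:
    assert all(f(n) <= c * n * n for n in range(1, 10_000))
```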
13. The set O(n³)
▪ O(n³) = { f(n): there exist positive constants
c and n0 such that 0 ≤ f(n) ≤ cn³ for all
n ≥ n0 }
▪ Some of the elements of O(n³)
◻ f4(n) = 100n³ + 3n² + 2
◻ f5(n) = 6n + 3
◻ f6(n) = 10000n²
14. Generalizing….
▪ O(n) = { f(n): there exist positive constants c and n0
such that 0 ≤ f(n) ≤ cn for all n ≥ n0 }
▪ O(n lg n) = { f(n): there exist positive constants c and
n0 such that 0 ≤ f(n) ≤ c n lg n for all n ≥ n0 }
O(g(n)) = { f(n): there exist positive constants c and
n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
15. O-notation gives an upper bound for a function to within a
constant factor.
f(n) = O(g(n)) if there are positive constants c and n0 such that to the
right of n0, the value of f(n) always lies on or below c·g(n).
Source: http://www.cs.unc.edu/~plaisted/comp122/02-asymp.ppt
16. Going back ….
■ Insertion Sort
◻ Worst-case running time is O(n²)
■ Worst-case running time T1(n) ≤ cn² for all values
of n ≥ n0, where c and n0 are positive constants.
Normally we write f(n) is O(g(n)) or f(n) = O(g(n)) to mean
"f(n) is a member of O(g(n))"
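To connect the bound back to the algorithm, here is a plain insertion sort instrumented to count key comparisons (the counting instrumentation is illustrative, not from the slides); on reverse-sorted input it performs exactly n(n−1)/2 comparisons, matching the O(n²) worst case:

```python
# Insertion sort that also reports how many element comparisons it made.
def insertion_sort(a):
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison of key vs a[j]
            if a[j] > key:
                a[j + 1] = a[j]       # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comparisons

# Worst case: reverse-sorted input forces i comparisons at step i,
# so the total is n(n-1)/2 comparisons, which is O(n^2).
n = 100
_, worst = insertion_sort(range(n, 0, -1))
assert worst == n * (n - 1) // 2
```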
17. Prove T(n) = n³ + 20n + 1 is O(n³)
■ By the big-O definition, T(n) is O(n³) if T(n) ≤ c·n³
for all n ≥ n0
■ Find c and n0
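One valid answer (my choice; any larger c also works): since 20n ≤ 20n³ and 1 ≤ n³ for n ≥ 1, T(n) ≤ (1 + 20 + 1)n³, so c = 22 and n0 = 1. A numeric spot-check:

```python
# T(n) = n^3 + 20n + 1 <= 22 n^3 for all n >= 1: the termwise argument
# above is the actual proof; this only sanity-checks it on a range.
def T(n):
    return n ** 3 + 20 * n + 1

c, n0 = 22, 1
assert all(T(n) <= c * n ** 3 for n in range(n0, 10_000))
```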
19. The set Ω(n) (read "big-omega of n")
■ An example[1]
T(n) = 2n + 3
2n ≤ T(n) for n ≥ 1
cn ≤ T(n) for n ≥ 1 and c = 2
T(n) belongs to Ω(n)
Ω(n) = { f(n): there exist positive constants c and n0
such that 0 ≤ cn ≤ f(n) for all n ≥ n0 }
20. Exercises
1. Is 2n + 1 ∈ Ω(n)?
2. Is 2n² + 10 ∈ Ω(n²)?
3. Is n³ ∈ Ω(n²)?
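All three answers are "yes"; the witnesses below are one valid choice each (mine, not the only ones), checked numerically:

```python
# Each entry: (f, g, c, n0); verify f(n) >= c*g(n) for n >= n0.
checks = [
    (lambda n: 2 * n + 1,      lambda n: n,     2, 1),  # 2n+1  >= 2n
    (lambda n: 2 * n * n + 10, lambda n: n * n, 2, 1),  # 2n^2+10 >= 2n^2
    (lambda n: n ** 3,         lambda n: n * n, 1, 1),  # n^3   >= n^2
]
for f, g, c, n0 in checks:
    assert all(f(n) >= c * g(n) for n in range(n0, 5_000))
```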
21. The set Ω(g(n))
Ω(g(n))
= { f(n): there exist positive constants c
and n0 such that 0 ≤ c·g(n) ≤ f(n) for
all n ≥ n0 }
22. Ω-notation gives a lower bound for a function to within a constant
factor
f(n) = Ω(g(n)) if there are positive constants c and n0 such that to
the right of n0, the value of f(n) always lies on or above c·g(n).
Source: http://www.cs.unc.edu/~plaisted/comp122/02-asymp.ppt
23. The set Θ(n)
■ An example[1] – T(n) = 2n + 3
T(n) ≤ 6n for n ≥ 1, so T(n) is O(n)
2n ≤ T(n) for n ≥ 1, so T(n) is Ω(n)
T(n) belongs to Θ(n)
Θ(n) = { f(n): there exist positive constants c1, c2 and
n0 such that 0 ≤ c1n ≤ f(n) ≤ c2n for all n ≥ n0 }
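The sandwich for T(n) = 2n + 3 can be spot-checked with the slide's constants c1 = 2, c2 = 6, n0 = 1:

```python
# Check c1*n <= T(n) <= c2*n over a finite range (a sketch, not a proof;
# the inequalities hold for all n >= 1 since 2n <= 2n+3 <= 2n+3n = 5n <= 6n).
def T(n):
    return 2 * n + 3

c1, c2, n0 = 2, 6, 1
assert all(c1 * n <= T(n) <= c2 * n for n in range(n0, 10_000))
```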
24. The set Θ(g(n))
Θ(g(n)) = { f(n): there exist positive
constants c1, c2 and n0 such that
0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all
n ≥ n0 }
25. Θ-notation gives a tight bound for a function to within constant
factors
f(n) = Θ(g(n)) if there exist positive constants c1, c2 and n0 such
that to the right of n0, the value of f(n) always lies between
c1·g(n) and c2·g(n) inclusive.
Source: http://www.cs.unc.edu/~plaisted/comp122/02-asymp.ppt
26. Asymptotic notations – Formal
definitions [1]
■ O(g(n)) = { f(n): there exist positive constants c
and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
■ Ω(g(n)) = { f(n): there exist positive constants c
and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
■ Θ(g(n)) = { f(n): there exist positive constants c1, c2
and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all
n ≥ n0 }
27. ■ f(n) = Θ(g(n))
◻ g(n) is an asymptotically tight bound for f(n)
■ f(n) = O(g(n))
◻ g(n) is an asymptotic upper bound for f(n)
■ f(n) = Ω(g(n))
◻ g(n) is an asymptotic lower bound for f(n)
28. Theorem [1]
For any two functions f(n) and g(n), we have
f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and
f(n) = Ω(g(n)).
Θ(f(n)) = O(f(n)) ∩ Ω(f(n)).
29. Examples[1]
• f(n) = an² + bn + c, where a, b, and c are
constants and a > 0
• f(n) = Θ(n²) → f(n) = Ω(n²) and f(n) = O(n²)
▪ For any polynomial p(n) of degree k (with positive
leading coefficient) we have p(n) = Θ(nᵏ)
▪ Any constant function is Θ(n⁰), or Θ(1).
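A sketch of the polynomial fact for one sample polynomial (the polynomial and the witnesses are my choices, for illustration): p(n) = 3n⁴ − 5n³ + 2 has degree 4, and c1 = 1, c2 = 4, n0 = 3 give the Θ(n⁴) sandwich:

```python
# p(n) = 3n^4 - 5n^3 + 2: degree-4 polynomial with positive leading
# coefficient; check 1*n^4 <= p(n) <= 4*n^4 for n >= 3 (a spot-check).
def p(n):
    return 3 * n ** 4 - 5 * n ** 3 + 2

c1, c2, n0 = 1, 4, 3
assert all(c1 * n ** 4 <= p(n) <= c2 * n ** 4 for n in range(n0, 5_000))
```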
30. Insertion Sort – Running Time
■ Best-case running time is Ω(n).
◻ Implies the running time on any input is Ω(n).
■ The running time is not Ω(n²) – the best case is
only linear.
■ Worst-case running time is Ω(n²).
■ Is it correct to say the best-case running time is
Θ(n)?
31. Is O(n lg n) algorithm preferred over O(n2)?
■ Suppose T1(n) ≤ 50 n lg n and T2(n) ≤ 2n²
■ Check the values of T1(n) and T2(n) when n = 2 and
n = 1024
■ For small input sizes, the O(n²) algorithm may run faster.
■ Once the input size becomes large enough, the O(n lg n)
algorithm runs faster
◻ irrespective of the constant factors
◻ irrespective of the implementation.
■ Read the corresponding sections in CLRS.
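The comparison on this slide can be reproduced numerically (treating the two bounds as exact costs, which is a simplifying assumption):

```python
import math

# The two bounds from the slide: T1(n) <= 50 n lg n, T2(n) <= 2 n^2.
def t1(n):
    return 50 * n * math.log2(n)

def t2(n):
    return 2 * n * n

# Small n: the quadratic cost is smaller (t2(2) = 8 vs t1(2) = 100).
assert t2(2) < t1(2)
# Large n: n lg n wins (t1(1024) = 512000 vs t2(1024) = 2097152).
assert t1(1024) < t2(1024)
# The crossover, where 25 lg n = n, lies between n = 128 and n = 256.
assert t2(128) < t1(128) and t1(256) < t2(256)
```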
32. References
1. Thomas H. Cormen, Charles E. Leiserson, Ronald
L. Rivest, and Clifford Stein. Introduction to
Algorithms, PHI, 2001.
2. Sara Baase and Allen Van Gelder. Computer
Algorithms: Introduction to Design & Analysis,
Pearson Education, third edition, 2000.
3. Donald E. Knuth. Big Omicron and Big Omega and
Big Theta. ACM SIGACT News, 1976.
4. Gilles Brassard and Paul Bratley. Fundamentals of
Algorithmics, PHI, 1997.