2. Order notation
Order notation is a mathematical method for
bounding the performance of an algorithm as its
input size grows without bound.
It allows us to define and compare an
algorithm’s performance in a way that is free
from uncontrolled influences like machine load,
implementation efficiency, and so on.
3. Order notation
When trying to characterize an algorithm’s
efficiency in terms of execution time, independent
of any particular program or computer, it is
important to quantify the number of operations or
steps that the algorithm will require.
If each of these steps is considered to be a basic
unit of computation, then the execution time for an
algorithm can be expressed as the number of steps
required to solve the problem.
Deciding on an appropriate basic unit of
computation can be a complicated problem and will
depend on how the algorithm is implemented.
4. Running time at one step per nanosecond:

   Input      A1 = n steps   A2 = 5n steps   A3 = n^2 steps
   n = 1      1 nsec         5 nsec          1 nsec
   n = 100    100 nsec       500 nsec        10^4 nsec
   n = 10^9   1 sec          5 sec           10^18 nsec
Asymptotic notations

   Notation         Symbol   Comparison
   1) Big Oh        O        <=
   2) Big Omega     Ω        >=
   3) Theta         Θ        =
   4) Small Oh      o        <
   5) Small Omega   ω        >

Asymptotic notations are used to compare the running times of algorithms.
5. • The O (pronounced big-oh) is the formal method of
expressing the upper bound of an algorithm's running
time.
• It's a measure of the longest amount of time it could
possibly take for the algorithm to complete.
• We can assume that it represents the "worst case
scenario" of a program.
• More formally, for non-negative functions f(n) and
g(n): if there exist a constant c > 0 and an integer n0
such that f(n) <= c.g(n) for all integers n > n0, then
f(n) = O(g(n)).
Big Oh Notation, Ο
6. f(n) = O(g(n)) if and only if
f(n) <= C.g(n) for some C > 0 whenever n >= n0 >= 0;
then f(n) is big O of g(n).
For example:
f(n) = n, g(n) = 5n
Is f(n) = O(g(n))?
f(n) <= C.g(n)
n <= C(5n), where C is a positive constant; here we take C = 1
n <= 1(5n) = 5n, which holds for all n >= 0
Big O notations
7. For example 2:
f(n) = 2n+10, g(n) = n
Is f(n) = O(g(n))?
f(n) <= C.g(n) for some C > 0, n >= n0 >= 0
2n+10 <= C(n)
2n+10 <= 3n, taking C = 3
10 <= 3n - 2n
10 <= n, so the bound holds for all n >= n0 = 10
9. Growth rates:

   Growth rate   Name           Notation
   1             Constant       O(1)
   log(n)        Logarithmic    O(log n)
   n             Linear         O(n)
   n log(n)      Linearithmic   O(n log n)
   n^2           Quadratic      O(n^2)
   n^3           Cubic          O(n^3)
   2^n           Exponential    2^O(n)
   n^c           Polynomial     n^O(1)
10. Growth rates with code examples:

   Growth rate   Name           Code example                                        Description
   1             Constant       a = b + 1;                                          Statement (one line of code)
   log(n)        Logarithmic    while(n>1){ n=n/2; }                                Divide in half (binary search)
   n             Linear         for(c=0; c<n; c++){ a+=1; }                         Loop
   n*log(n)      Linearithmic   Mergesort, Quicksort, …                             Effective sorting algorithms
   n^2           Quadratic      for(c=0; c<n; c++){ for(i=0; i<n; i++){ a+=1; } }   Double loop
   n^3           Cubic          for(c=0; c<n; c++){ for(i=0; i<n; i++){             Triple loop
                                  for(x=0; x<n; x++){ a+=1; } } }
   2^n           Exponential    Trying to break a password by generating            Exhaustive search
                                  all possible combinations
11. • The notation Ω(n) is the formal way to express the lower
bound of an algorithm's running time.
• It measures the best-case time complexity, i.e. the
minimum amount of time an algorithm can possibly take to
complete.
Big Omega Ω
12. f(n) = Ω(g(n)) if and only if
f(n) >= C.g(n) for some C > 0 whenever n >= n0 >= 0;
then f(n) is big Ω of g(n).
For example:
f(n) = 3n+2, g(n) = n
Is f(n) = Ω(g(n))?
f(n) >= C.g(n)
3n+2 >= 1.n, taking C = 1; this holds for all n >= 0
Big Ω notations
14. • The notation θ(n) is the formal way to express both the
lower bound and the upper bound of an algorithm's
running time.
Theta Θ
15. f(n) = Θ(g(n)) if and only if
C1.g(n) <= f(n) <= C2.g(n) for some C1, C2 > 0 whenever n >= n0 >= 0;
then f(n) is big Θ of g(n).
For example:
f(n) = 3n+2, g(n) = n
f(n) >= C1.g(n): 3n+2 >= 2.n, taking C1 = 2 (holds for all n >= 0)
f(n) <= C2.g(n): 3n+2 <= 4.n, taking C2 = 4 (holds for all n >= 2)
So f(n) = Θ(n) with n0 = 2.
Big Θ notations