Big O Notation
Hamza Mushtaq
Islamia University Bahawalpur
+Marcello Missiroli
IIS Corni, Modena
A Brief Overview
Time Complexity
In time complexity analysis we express the running time as a
function of the problem size and estimate how the execution
time grows as the problem size grows.
Memory Space
•The space required by the program to store input data and results in memory
(RAM).
History of Big-O-Notation
•Big O notation (with a capital letter O, not a zero), also called Landau's
symbol, is a notation used in complexity theory, computer science, and
mathematics to describe the asymptotic behavior of functions. Basically, it tells
you how fast a function grows or declines.
•Landau's symbol is named after the German mathematician
Edmund Landau, who popularized the notation.
•The letter O is used because the rate of growth of a function is also called
its order.
What is Big O notation?
It describes
•Efficiency of an Algorithm
•Time Factor of an Algorithm
•Space Complexity of an Algorithm
It is written as O(f(n)), where n is the size of the input.
Calculations
•Estimate all basic operations (assignments,
arithmetic operations, I/O, memory accesses)
•Sum them all
•Remove constants
•Keep largest term only
Example
•Evaluate:
Assignments: 4*n
Arithmetic operations: 2*n*n
I/O: log(n)
•Sum: 4*n+2*n*n+log(n)
•Remove constants & keep the largest term only: 2*n*n
→ O(n^2)
O(1)
void print(int index, int values[]) {
    cout << "the value at index " << index << " is " << values[index];
}
You can directly access an element of the array by its
index, so no matter how big the array gets, the time it
takes to access that element won't change.
CONSTANT COMPLEXITY (VERY GOOD)
O(n)
void print(int count, int values[]) {
    for (int x = 0; x < count; x++)
        cout << x << ". " << values[x] << endl;
}
As you increase the size of the array, the time for this
algorithm will increase proportionally. An array of 50 will take
twice the time as an array of 25 and half the time of an array
of 100.
LINEAR COMPLEXITY (GOOD)
O(n^2)
int CompareToAllNumbers(int array1[], int size1,
                        int array2[], int size2) {
    int x;
    for (x = 0; x < size1; x++) {
        bool isMin = true;
        for (int y = 0; y < size2; y++) {
            if (array1[x] > array2[y])
                isMin = false;
        }
        if (isMin) break;
    }
    return array1[x];
}
If you add one element to the first array, you must scan the entire
second array one additional time. As the input grows, the time
required to run the algorithm grows very quickly.
POLYNOMIAL COMPLEXITY (BAD THOUGH TOLERABLE)
O(n!)
•Travelling salesman problem
As the number of cities increases, the runtime grows by the
factorial of the number of cities.
•Enumerating all possible chess moves
Storing all legal positions would take about 4.45*10^46 bits,
more than the number of molecules on Earth.
FACTORIAL COMPLEXITY
– REALLY BAD “INTRACTABLE” PROBLEMS
Summary of Complexity
Constant complexity O(1)
Logarithmic complexity O(log(n))
Linear Complexity O(n)
Pseudolinear Complexity O(nlog(n))
Polynomial Complexity O(n^k), k >= 2
Exponential Complexity O(k^n)
int CompareSmallestNumber(int array[], int size) {
    int curMin = array[0];
    for (int x = 1; x < size; x++) {
        if (array[x] < curMin)
            curMin = array[x];
    }
    return curMin;
}

int CompareToAllNumbers(int array[], int size) {
    int x;
    for (x = 0; x < size; x++) {
        bool isMin = true;
        for (int y = 0; y < size; y++) {
            if (array[x] > array[y])
                isMin = false;
        }
        if (isMin) break;
    }
    return array[x];
}
We have two different algorithms
to find the smallest element in an array.
Algorithm 1 (Best ++):
•Scans the array once, tracking the current minimum in a temporary variable
•Its Big O notation is O(n)
This algorithm is efficient because its Big O
is small compared to the second algorithm,
so it gives the result in less time.
Algorithm 2 (Worst --):
•Compares every element of the array with every other element
•Its Big O notation is O(n^2)
This algorithm is less effective than the other
because it has a higher Big O,
so it takes more time.
SCENARIOS: BEST, AVERAGE, WORST
Best case:
When the algorithm takes the least
run time over the given inputs/data.
The best case is written with Big Ω.
Average case:
When the algorithm takes an
average run time over the inputs/data.
This is written with Big Θ.
Worst case:
When the algorithm takes the highest runtime
over the given inputs/data.
The worst case is written with Big O.
Linear search (n elem)
Best case: 1 comparison, Ω(1).
Average case: n/2 comparisons, Θ(n).
Worst case: n comparisons, O(n).
Binary search (n elem)
Best case: 1 comparison, Ω(1).
Average case: about log2(n) - 1 comparisons, Θ(log(n)).
Worst case: log2(n) comparisons, O(log(n)).
Insertion sort (n elem)
Best case (time): Ω(n).
Worst case (time): O(n^2).
Worst case (space): O(1), it sorts in place.
Selection sort (n elem)
Best case (time): Ω(n^2), every pass scans the whole unsorted remainder.
Worst case (time): O(n^2).
Worst case (space): O(1), it sorts in place.
Bubble sort (n elem)
Best case (time): Ω(n), one pass with an early-exit flag.
Worst case (time): O(n^2).
Worst case (space): O(1), it sorts in place.
Stupid sort / bogosort (n elem)
Best case (time): Ω(n).
Worst case (time): O(n!).
Worst case (space): O(∞), unbounded.
Nothing better?
Looking for n*log(n) in sorting...
•Shellsort
•Quicksort
•Mergesort