Outline: Design and Analysis of Algorithm
Introduction
  Definition
  Properties of Algorithm
  Study of Algorithm
Complexity Analysis
  Concept of Complexity
  Space & Time Complexity
  Standard Function
  Type of Analysis
Ordered Notation
  Big-Oh and Little-Oh
  Big-Omega and Little-Omega
  Big-Theta
  Examples of Asymptotic Notation
Recurrence Relation
  Substitution Method
  Iteration Method
  Master's Theorem
Case Study on Quicksort
Definition
An algorithm is a finite sequence of instructions, each of which has a clear
meaning and can be performed with a finite amount of effort in a finite length
of time.
Input (e.g., an unsorted list) → Algorithm → Output (e.g., a sorted list)
Reference: Aho, Hopcroft, and Ullman (1983)
Properties of Algorithm
Finiteness: The algorithm must terminate after a finite number of steps
(i.e., there is no infinite loop).
Input: Zero, one, or more inputs.
Output: At least one output.
Effectiveness: Each instruction is basic enough to be carried out, i.e.,
performed in a finite amount of time.
Definiteness: There is no ambiguity in the instructions; each instruction can
be interpreted in only one way, so it can be performed without confusion.
Reference: Horowitz, Sahni, and Rajasekaran (1997)
Study of Algorithm
The study of algorithms includes these five areas:
How to write/create an algorithm.
How to express an algorithm.
How to validate an algorithm.
How to analyze an algorithm.
How to test a program.
Reference: Horowitz, Sahni, and Rajasekaran (1997)
Complexity
The complexity of an algorithm M is the function f(n) that gives the running
time and/or storage space requirement of the algorithm in terms of the size n
of the input data.
Space Complexity
Time Complexity
Although space complexity is a factor, complexity in the broad sense usually
refers to time complexity, because nowadays storage is far less of an overhead
than time.
Reference: Seymour, L. (2010)
Space Complexity
The amount of memory space an algorithm needs is called its space complexity.
Ex.: Algorithm Sum(A, n)
// A is an array of size n
{
    S := 0.0;
    for i := 1 to n do
        S := S + A[i];
    return S;
}
Total Space required is
A → n words
S → 1 word
i → 1 word
n → 1 word
Total → (n+3) words
Reference: Horowitz, Sahni, and Rajasekaran (1997)
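Illustration (not part of the original slides): a minimal C version of the same
routine, with comments mapping each variable to the word count above. The names
sum, A, S are simply carried over from the pseudocode.

#include <stdio.h>

/* Sum of an array of n doubles, mirroring the pseudocode above.
   Space: A occupies n words; S, i, and n occupy one word each,
   giving the (n + 3)-word total from the slide. */
double sum(const double A[], int n)
{
    double S = 0.0;                /* 1 word */
    for (int i = 0; i < n; i++)    /* i: 1 word (n itself: 1 word) */
        S = S + A[i];
    return S;
}

int main(void)
{
    double A[] = {1.0, 2.0, 3.0, 4.0};
    printf("%.1f\n", sum(A, 4));   /* prints 10.0 */
    return 0;
}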
Time Complexity
Time complexity: the time spent by an algorithm to produce one or more outputs.
It can be studied by
  Theoretical analysis
  Empirical analysis
Important consideration: assume each basic operation takes one unit of time.
For the statement x ← x + y, the total execution time is then:

  x ← x + y                          1 unit

  for i := 1 to n do
      x ← x + y                      n units

  for i := 1 to n do
      for j := 1 to n do
          x ← x + y                  n² units
Reference: Horowitz, Sahni, and Rajasekaran (1997)
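A small C sketch (mine, not from the slides) that counts how many times
x = x + y executes in each of the three patterns, printing 1, n, and n² for the
chosen n:

#include <stdio.h>

int main(void)
{
    long n = 1000, x = 0, y = 1, count;

    count = 0;                                            /* single statement: 1 unit */
    x = x + y; count++;
    printf("single statement : %ld operation(s)\n", count);

    count = 0;                                            /* single loop: n units */
    for (long i = 1; i <= n; i++) { x = x + y; count++; }
    printf("single loop      : %ld operation(s)\n", count);

    count = 0;                                            /* nested loops: n^2 units */
    for (long i = 1; i <= n; i++)
        for (long j = 1; j <= n; j++) { x = x + y; count++; }
    printf("nested loops     : %ld operation(s)\n", count);

    return 0;
}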
Rate of Growth of Standard function
Suppose M is an algorithm and n is the size of the input data. Clearly the
complexity f(n) of M increases as n increases. It is usually the rate of
increase of f(n) that we want to examine; this is usually done by comparing
f(n) with some standard functions, such as
log n, n, n log n, n², n³, 2ⁿ
Reference: Seymour, L. (2010)
Rate of Growth of Standard function
    n   log n      n   n log n      n²       n³       2ⁿ
    4       2      4         8      16       64       16
    5       3      5        15      25      125       32
   10       4     10        40     100      10³      10³
  100       7    100       700     10⁴      10⁶     10³⁰
 1000      10    10³       10⁴     10⁶      10⁹    10³⁰⁰
We can see that the logarithmic function log₂ n grows most slowly, the
exponential function 2ⁿ grows most rapidly, and the polynomial functions nᶜ
grow according to the exponent c.
Reference: Seymour, L. (2010)
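A C sketch (mine, not from the slides) that reproduces the growth-rate
comparison above; compile with -lm, and note 2ⁿ is printed in scientific
notation:

#include <stdio.h>
#include <math.h>

int main(void)
{
    double ns[] = {4, 5, 10, 100, 1000};
    printf("%6s %8s %8s %10s %12s %14s %12s\n",
           "n", "log n", "n", "n log n", "n^2", "n^3", "2^n");
    for (int i = 0; i < 5; i++) {
        double n = ns[i];
        printf("%6.0f %8.2f %8.0f %10.0f %12.0f %14.0f %12.3e\n",
               n, log2(n), n, n * log2(n), n * n, n * n * n, pow(2.0, n));
    }
    return 0;
}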
Graphical Representation [figure: rate-of-growth curves of the standard functions]
Reference: Horowitz, Sahni, and Rajasekaran (1997)
Type of Analysis
Worst case
Provides a maximum value of f (n) for any possible input
Provides an upper bound on running time
An absolute guarantee that the algorithm would not run
longer, no matter what the inputs are
Best case
Provides a minimum value of f (n) for any possible input
Provides a lower bound on running time
Input is the one for which the algorithm runs the fastest
Average case
Provides an expected value of f (n)
Provides a prediction about the running time
Assumes that the input is random
Asymptotic Notation
These are mathematical notations for describing the running-time complexity of
an algorithm based on the order of magnitude of the frequency of execution of
its statements.
O notation (asymptotically less than or equal): f(n) = O(g(n)) roughly means f(n) ≤ g(n)
Ω notation (asymptotically greater than or equal): f(n) = Ω(g(n)) roughly means f(n) ≥ g(n)
Θ notation (asymptotic equality): f(n) = Θ(g(n)) roughly means f(n) = g(n)
Reference: Aho, Hopcroft, and Ullman (1983)
Big-Oh and Little-Oh
Definition (Big-Oh, O()):
Let f(n) and g(n) be functions that map positive integers to positive real
numbers. We say that f(n) is O(g(n)) (or f(n) ∈ O(g(n))) if there exist a real
constant c > 0 and an integer constant n0 ≥ 1 such that 0 ≤ f(n) ≤ c·g(n) for
every integer n ≥ n0.
Definition (Little-oh, o()):
Let f(n) and g(n) be functions that map positive integers to positive real
numbers. We say that f(n) is o(g(n)) (or f(n) ∈ o(g(n))) if for every real
constant c > 0 there exists an integer constant n0 ≥ 1 such that f(n) < c·g(n)
for every integer n ≥ n0.
Reference: McCann, (CS 345 2014)
Big-Oh and Little-Oh [figure: graphical comparison of f(n) with c·g(n) for n ≥ n0]
Reference: Cormen, T. H. (2009)
Big-Omega and Little-Omega
Definition (Big-Omega, Ω()): Let f(n) and g(n) be functions that map positive
integers to positive real numbers. We say that f(n) is Ω(g(n)) (or
f(n) ∈ Ω(g(n))) if there exist a real constant c > 0 and an integer constant
n0 ≥ 1 such that f(n) ≥ c·g(n) ≥ 0 for every integer n ≥ n0.
Definition (Little-omega, ω()): Let f(n) and g(n) be functions that map
positive integers to positive real numbers. We say that f(n) is ω(g(n)) (or
f(n) ∈ ω(g(n))) if for every real constant c > 0 there exists an integer
constant n0 ≥ 1 such that f(n) > c·g(n) for every integer n ≥ n0.
Reference: McCann, (CS 345 2014)
Big-Omega and Little-Omega [figure: graphical comparison of f(n) with c·g(n) for n ≥ n0]
Reference: Cormen, T. H. (2009)
Theta Notation
Definition (Big-Theta, Θ()): Let f(n) and g(n) be functions that map positive
integers to positive real numbers. We say that f(n) is Θ(g(n)) (or
f(n) ∈ Θ(g(n))) if and only if f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n)).
Reference: McCann, (CS 345 2014)
Relation Between Asymptotic Notations [figure: diagram relating O, o, Ω, ω, and Θ]
Reference: McCann, (CS 345 2014)
Example: Big-Oh Notation
Let f(n) = 7n + 8 and g(n) = n. Is f(n) ∈ O(g(n))?
For 7n + 8 ∈ O(n), we have to find c and n0 such that 7n + 8 ≤ c·n for all n ≥ n0.
By inspection, it is clear that c must be larger than 7. Let c = 8.
Now we need a suitable n0. In this case, f(8) = 64 = 8·g(8). Because the
definition of O() requires that f(n) ≤ c·g(n), we can select n0 = 8, or any
integer above 8; they will all work.
We have identified values of the constants, c = 8 and n0 = 8, such that
7n + 8 ≤ c·n for every n ≥ n0, so we can say that 7n + 8 is O(n).
Q: But how do we know that this holds for every n ≥ 8?
A: We can prove by induction that 7n + 8 ≤ 8n for all n ≥ 8.
Reference: McCann, (CS 345 2014)
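A C spot-check (my illustration, not from the slides) of the inequality over a
finite range; it only illustrates the claim, while the induction proof on the
next slide covers every n ≥ 8:

#include <stdio.h>

/* Check 7n + 8 <= 8n for n >= n0 = 8 (c = 8). */
int main(void)
{
    int holds = 1;
    for (long n = 8; n <= 1000000; n++)
        if (7 * n + 8 > 8 * n) { holds = 0; break; }
    printf("7n + 8 <= 8n for 8 <= n <= 10^6: %s\n", holds ? "TRUE" : "FALSE");
    return 0;
}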
Proof by Mathematical Induction
Claim: 7n + 8 ≤ 8n for all n ≥ 8.
Base step:
for n = n0 = 8,
  7·8 + 8 = 64 ≤ 64 = 8·8 → TRUE
Inductive hypothesis:
assume 7k + 8 ≤ 8k for some k ≥ 8.
Inductive step:
for n = k + 1,
  7(k + 1) + 8 = (7k + 8) + 7
              ≤ 8k + 7          (by the hypothesis)
              ≤ 8k + 8 = 8(k + 1) → TRUE
Hence it is proved that 7n + 8 ≤ 8n for all n ≥ 8.
Reference: McCann, (CS 345 2014)
Example: Big-Omega Notation
Claim: √n = Ω(log n), with f(n) = √n and g(n) = log n.
Choose c = 1 and n0 = 16.
By definition we need f(n) ≥ c·g(n) for all n ≥ n0.
Putting the values of c, n0, f(n), and g(n) into the definition:
  √16 ≥ 1·log₂ 16
  4 ≥ 4 → TRUE
Now putting n = 64 (i.e., n ≥ n0):
  √64 ≥ 1·log₂ 64
  8 ≥ 6 → TRUE
Hence we get f(n) ≥ c·g(n), i.e., √n = Ω(log n) → TRUE
Reference: McCann, (CS 345 2014)
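A C spot-check (my illustration, not from the slides) of √n ≥ 1·log₂ n for
n ≥ n0 = 16 over a finite range; a finite scan is not a proof. Compile with -lm.

#include <stdio.h>
#include <math.h>

int main(void)
{
    int holds = 1;
    for (long n = 16; n <= 1000000; n++)
        if (sqrt((double)n) < log2((double)n)) { holds = 0; break; }
    printf("sqrt(n) >= log2(n) for 16 <= n <= 10^6: %s\n", holds ? "TRUE" : "FALSE");
    return 0;
}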
Recurrence Relation
What is a Recurrence?
It is an equation that describes the running time of a recursive algorithm.
For Divide and Conquer algorithms we can use the following recurrence relation:

  T(n) = g(n)               for small n
  T(n) = 2T(n/2) + f(n)     otherwise

where T(n) is the time complexity,
g(n) is the time to compute the answer directly (for small inputs), and
f(n) is the time to divide the problem and combine the sub-solutions.

What are the methods to solve it?
  Substitution Method
  Iteration Method
  Master's Theorem
Reference: Horowitz, Sahni, and Rajasekaran (1997)
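Illustration (merge sort is my example; the slide does not name one): a C
skeleton whose running time has exactly this shape. The base case is g(n), the
two recursive calls contribute 2T(n/2), and merge() is the f(n) = Θ(n)
divide/combine work. The names merge_sort and merge are my own.

#include <stdio.h>
#include <string.h>

/* Combine step: merge sorted halves a[lo..mid) and a[mid..hi).  f(n) = Theta(n). */
static void merge(int a[], int lo, int mid, int hi, int tmp[])
{
    int i = lo, j = mid, k = lo;
    while (i < mid && j < hi) tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i < mid) tmp[k++] = a[i++];
    while (j < hi)  tmp[k++] = a[j++];
    memcpy(a + lo, tmp + lo, (size_t)(hi - lo) * sizeof a[0]);
}

/* T(n) = g(n) for small n;  T(n) = 2T(n/2) + f(n) otherwise. */
static void merge_sort(int a[], int lo, int hi, int tmp[])
{
    if (hi - lo <= 1) return;        /* g(n): small input, answer is direct */
    int mid = lo + (hi - lo) / 2;
    merge_sort(a, lo, mid, tmp);     /* T(n/2) */
    merge_sort(a, mid, hi, tmp);     /* T(n/2) */
    merge(a, lo, mid, hi, tmp);      /* f(n)  */
}

int main(void)
{
    int a[] = {5, 2, 4, 7, 1, 3, 2, 6}, tmp[8];
    merge_sort(a, 0, 8, tmp);
    for (int i = 0; i < 8; i++) printf("%d ", a[i]);   /* 1 2 2 3 4 5 6 7 */
    printf("\n");
    return 0;
}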
Substitution Method
Consider the recurrence
  T(1) = c1
  T(n) = 2T(n/2) + c2·n
Solution:
We evaluate T(n/2) by replacing n with n/2:
  T(n/2) = 2T(n/4) + c2·(n/2)
Substituting the right-hand side for T(n/2):
  T(n) = 2(2T(n/4) + c2·n/2) + c2·n = 4T(n/4) + 2c2·n
Similarly, we could substitute n/4 for n:
  T(n/4) = 2T(n/8) + c2·(n/4)
Substituting the right-hand side for T(n/4):
  T(n) = 8T(n/8) + 3c2·n
Now we have a pattern; by induction on i,
  T(n) = 2^i · T(n/2^i) + i·c2·n
Assuming n is a power of 2, say n = 2^k, we take i = k:
  T(n) = 2^k · T(1) + k·c2·n
Since 2^k = n, we know k = log n, hence
  T(n) = c1·n + c2·n·log n
This bound is tight, which proves that T(n) = O(n log n).
Reference: Aho, Hopcroft, and Ullman (1983)
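A numeric check (my illustration, not from the slides): evaluating the
recurrence directly and comparing it with the closed form c1·n + c2·n·log₂ n
for powers of two. The constants c1 = 3 and c2 = 5 are arbitrary illustrative
values.

#include <stdio.h>

static const long long c1 = 3, c2 = 5;      /* illustrative constants */

static long long T(long long n)              /* direct evaluation of the recurrence */
{
    return (n == 1) ? c1 : 2 * T(n / 2) + c2 * n;
}

int main(void)
{
    for (long long n = 1, k = 0; n <= (1LL << 20); n <<= 1, k++) {
        long long closed = c1 * n + c2 * n * k;        /* k = log2(n) */
        if (T(n) != closed) { printf("mismatch at n = %lld\n", n); return 1; }
    }
    printf("T(n) = c1*n + c2*n*log2(n) verified for n = 1, 2, 4, ..., 2^20\n");
    return 0;
}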
Iteration Method
T(a) = Θ(1)
T(n) = T(n − a) + T(a) + n
Find T(n) using the iteration method.
Solution:
T(n) = T(n − a) + T(a) + n                                      i = 1
     = [T(n − 2a) + T(a) + (n − a)] + T(a) + n
     = T(n − 2a) + 2T(a) + 2n − a                               i = 2
     = [T(n − 3a) + T(a) + (n − 2a)] + 2T(a) + 2n − a
     = T(n − 3a) + 3T(a) + 3n − (a + 2a)                        i = 3
     = [T(n − 4a) + T(a) + (n − 3a)] + 3T(a) + 3n − (a + 2a)
     = T(n − 4a) + 4T(a) + 4n − (a + 2a + 3a)                   i = 4
     ...
     = T(n − ia) + iT(a) + in − a(1 + 2 + ... + (i − 1))
     = T(n − ia) + iT(a) + in − a·i(i − 1)/2                    i-th step
After n/a steps the iteration stops. Setting i = n/a:
T(n) = T(0) + (n/a)T(a) + n²/a − a·(n/a)((n/a) − 1)/2
     = Θ(1) + (n/a)·Θ(1) + n²/a − n²/(2a) + n/2
     = Θ(1) + Θ(n) + n²/(2a) + n/2
     = Θ(n²)
T(n) = Θ(n²)
Reference: Aho, Hopcroft, and Ullman (1983)
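A numeric check (my illustration, not from the slides): taking T(x) = 1 for
x ≤ a as the Θ(1) base case and iterating T(n) = T(n − a) + T(a) + n, the ratio
T(n)/n² should approach 1/(2a). The base value 1 and a = 3 are illustrative
choices.

#include <stdio.h>

static long long eval_T(long long n, long long a)
{
    if (n <= a) return 1;
    long long m = n % a;                 /* bottom of the chain n, n-a, n-2a, ... */
    if (m == 0) m = a;
    long long t = 1;                     /* T(m) with m <= a */
    for (m += a; m <= n; m += a)
        t = t + 1 + m;                   /* T(m) = T(m-a) + T(a) + m, with T(a) = 1 */
    return t;
}

int main(void)
{
    const long long a = 3;
    for (long long n = 1000; n <= 1000000; n *= 10)
        printf("n = %7lld   T(n)/n^2 = %.6f   1/(2a) = %.6f\n",
               n, (double)eval_T(n, a) / ((double)n * (double)n), 1.0 / (2 * a));
    return 0;
}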
Master’s Theorem
Master's Theorem applies to recurrences of the form
  T(n) = aT(n/b) + f(n),
where a ≥ 1, b > 1, and f is asymptotically positive.
Here f(n) is taken to be a polynomial of the form f(n) = n^d.
Hence, for a recurrence relation of the form
  T(n) = aT(n/b) + n^d,
Master's Theorem says that
  T(n) = Θ(n^(log_b a))    if a > b^d
  T(n) = Θ(n^d · log n)    if a = b^d
  T(n) = Θ(n^d)            if a < b^d
Reference: Leiserson and Demaine (2005)
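A C sketch (my illustration, not from the slides) of the simplified statement
above for T(n) = a·T(n/b) + n^d; the helper name master() is hypothetical, and
the sketch only covers the polynomial f(n) = n^d form. Compile with -lm.

#include <stdio.h>
#include <math.h>

static void master(double a, double b, double d)
{
    double crit = log(a) / log(b);       /* log_b(a) */
    const double eps = 1e-9;
    printf("a = %.0f, b = %.0f, d = %.0f  =>  ", a, b, d);
    if (d < crit - eps)                  /* a > b^d */
        printf("Theta(n^%.2f)\n", crit);
    else if (fabs(d - crit) <= eps)      /* a = b^d */
        printf("Theta(n^%.0f log n)\n", d);
    else                                 /* a < b^d */
        printf("Theta(n^%.0f)\n", d);
}

int main(void)
{
    master(4, 2, 1);   /* Case 1: Theta(n^2)       (slide example)       */
    master(4, 2, 2);   /* Case 2: Theta(n^2 log n) (slide example)       */
    master(4, 2, 3);   /* Case 3: Theta(n^3)       (slide example)       */
    master(2, 2, 1);   /* Theta(n log n): quicksort best case, below     */
    return 0;
}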
Master’s Theorem: Case 1
Compare f(n) with n^(log_b a):
1. f(n) = O(n^((log_b a) − ε)) for some constant ε > 0
   f(n) grows polynomially slower than n^(log_b a) (by an n^ε factor)
   Solution: T(n) = Θ(n^(log_b a))
Ex. T(n) = 4T(n/2) + n
   a = 4, b = 2 ⇒ n^(log_b a) = n², f(n) = n
   CASE 1: f(n) = O(n^(2 − ε)) for ε = 1
Alternatively:
   a = 4, b = 2, d = 1, b^d = 2¹ = 2, a > b^d
   T(n) = Θ(n^(log_2 4)) = Θ(n²)
Reference: Leiserson and Demaine (2005)
Master’s Theorem: Case 2
Compare f(n) with n^(log_b a):
2. f(n) = Θ(n^(log_b a) · logᵏ n) for some constant k ≥ 0
   f(n) and n^(log_b a) grow at a similar rate
   Solution: T(n) = Θ(n^(log_b a) · log^(k+1) n)
Ex. T(n) = 4T(n/2) + n²
   a = 4, b = 2 ⇒ n^(log_b a) = n², f(n) = n²
   CASE 2: f(n) = Θ(n² · log⁰ n), that is, k = 0
Alternatively:
   a = 4, b = 2, d = 2, b^d = 2² = 4, a = b^d
   T(n) = Θ(n² log n)
Reference: Leiserson and Demaine (2005)
Master’s Theorem: Case 3
Compare f(n) with n^(log_b a):
3. f(n) = Ω(n^((log_b a) + ε)) for some constant ε > 0
   f(n) grows polynomially faster than n^(log_b a) (by an n^ε factor),
   and f(n) satisfies the regularity condition a·f(n/b) ≤ c·f(n)
   for some constant c < 1
   Solution: T(n) = Θ(f(n))
Ex. T(n) = 4T(n/2) + n³
   a = 4, b = 2 ⇒ n^(log_b a) = n², f(n) = n³
   CASE 3: f(n) = Ω(n^(2 + ε)) for ε = 1,
   and 4(n/2)³ = n³/2 ≤ c·n³ (regularity condition) holds for c = 1/2
Alternatively:
   a = 4, b = 2, d = 3, b^d = 2³ = 8, a < b^d
   T(n) = Θ(n³)
Reference: Leiserson and Demaine (2005)
Quicksort Algorithm
1. Choose pivot
   Pick an index using rand(1, n), or
   take the middle element, a[n/2], of the array a[1..n]
2. Partition
   Place every element a[i] < pivot in the first sublist
   Place every element a[i] > pivot in the second sublist
3. Recursive sorts (where k is the pivot's final position)
   sort a[1..k-1]   // sort the first sublist
   sort a[k+1..n]   // sort the second sublist
4. Repeat steps 1 to 3 for each sublist (a runnable C sketch follows)
- Tony Hoare, 1960
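A minimal C sketch of these steps (my illustration, not taken from the slide);
it picks the middle element as the pivot and uses a simple swap-based
partition, with quicksort and swap as illustrative names:

#include <stdio.h>

static void swap(int *x, int *y) { int t = *x; *x = *y; *y = t; }

/* Sort a[lo..hi] in place, following steps 1-4 above. */
static void quicksort(int a[], int lo, int hi)
{
    if (lo >= hi) return;                      /* sublist of size 0 or 1 */

    swap(&a[lo + (hi - lo) / 2], &a[hi]);      /* 1. choose pivot: middle element */
    int pivot = a[hi];

    int k = lo;                                /* 2. partition around the pivot */
    for (int i = lo; i < hi; i++)
        if (a[i] < pivot) swap(&a[i], &a[k++]);
    swap(&a[k], &a[hi]);                       /* pivot ends up at index k */

    quicksort(a, lo, k - 1);                   /* 3./4. recursively sort sublists */
    quicksort(a, k + 1, hi);
}

int main(void)
{
    int a[] = {9, 3, 7, 1, 8, 2, 5};
    quicksort(a, 0, 6);
    for (int i = 0; i < 7; i++) printf("%d ", a[i]);   /* 1 2 3 5 7 8 9 */
    printf("\n");
    return 0;
}

This uses a Lomuto-style partition for brevity; Hoare's original scheme scans
from both ends, but the complexity analysis on the following slides applies
either way.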
Case Study on Quicksort
Best Case:
Quicksort partitions the array A[1, . . . , n] into two equal halves each time.
Hence the recurrence takes the form
  T(n) = T(n/2) + T(n/2) + Θ(n)
       = 2T(n/2) + Θ(n)
This is Case 2 of Master's Theorem, so the complexity becomes Θ(n log n).
Reference: R B Muhammad (Accessed on March 2013)
Case Study on Quicksort
Average Case:
The recursion tree has depth Θ(log n), and Θ(n) work is performed at each of
these levels. This is an intuitive argument for why the average-case running
time of QUICKSORT is Θ(n log n).
Reference: R B Muhammad (Accessed on March 2013)
Case Study on Quicksort
Worst Case:
The worst case occurs if the given array A[1, . . . , n] is already sorted.
Partition is then called on subarrays of length
  n, n − 1, n − 2, . . . , 2,
and the running time is proportional to
  n + (n − 1) + (n − 2) + . . . + 2 = [(n + 2)(n − 1)]/2 = Θ(n²).
The worst case also occurs if A[1, . . . , n] starts out in reverse order.
Reference: R B Muhammad (Accessed on March 2013)
References
Books –
Aho, A. V., Hopcroft, J. E., & Ullman, J. D. (1983). Data structures and algorithms. Reading, MA:
Addison-Wesley.
Horowitz, E., Sahni, S., & Rajasekaran, S. (1997). Computer algorithms C++: C++ and pseudocode
versions. Macmillan.
Seymour, L. (2010). Data Structures with C. Tata McGraw Hill.
Cormen, T. H. (2009). Introduction to Algorithms. MIT press.
Online Course –
McCann (2014). Analysis of Discrete Structures (CS 345),
https://www2.cs.arizona.edu/classes/cs345/summer14/
Leiserson, C., & Demaine, E. (2005). Introduction to Algorithms (SMA 5503), Fall 2005. MIT OpenCourseWare,
https://ocw.mit.edu
Muhammad, R. B. Design and Analysis of Algorithm (accessed March 2013),
http://www.personal.kent.edu/~rmuhamma/Algorithms/algorithm.html
Web –
http://www.cs.odu.edu
http://www.cs.cf.ac.uk
http://ocw.mit.edu
Thank You!
varun.kumar.ojha@gmail.com
Ojha V K, Design and analysis of algorithms.