Design and Analysis of Algorithms
CSE 350
Unit 1
Syllabus
Unit 1 Introduction
A. Introduction: Algorithms, Analyzing algorithms, Complexity of algorithms, Growth of functions, Performance measurements
B. Asymptotic Notations and their properties – Mathematical analysis for Recursive and Non-recursive algorithms, Recurrence relations, Master Method
C. Divide-and-conquer: Analysis and structure of divide-and-conquer algorithms, Divide-and-conquer examples – Quick sort, Merge sort, Sorting in Linear Time, Heap Sort
What is an Algorithm?
 A finite set of instructions that specifies a sequence of operations to be carried out in order to solve a specific problem or class of problems is called an Algorithm.
Why study Algorithms?
 As processor speeds increase, performance is frequently said to be less central than other software quality characteristics (e.g. security, extensibility, reusability).
 However, large problem sizes are commonplace in the area of computational science, which makes performance a very important factor.
 The study of algorithms, therefore, gives us a language to express performance as a function of problem size.
Why study algorithms and performance?
• Algorithms help us to understand scalability.
• Performance often draws the line between what is feasible and what is
impossible.
• Algorithmic mathematics provides a language for talking about program
behavior.
• The lessons of program performance generalize to other computing
resources.
• Speed is fun!
Analysis of algorithms: the theoretical study of computer-program performance and resource usage.
What’s more important than performance?
• Modularity
• Correctness
• Maintainability
• Functionality
• Robustness
• User-friendliness
• Programmer Time
• Simplicity
• Extensibility
• Reliability
Characteristics of Algorithms
 Input: it is supplied zero or more externally provided quantities.
 Output: it produces at least one quantity.
 Definiteness: each instruction should be clear and unambiguous.
 Finiteness: an algorithm should terminate after executing a finite number of steps.
 Effectiveness: every instruction should be basic enough to be carried out, in principle, by a person using only pen and paper.
 Feasible: each instruction must be feasible to carry out with the available resources.
 Flexibility: it must be flexible enough to accommodate desired changes with little effort.
 Efficient: efficiency is measured in terms of the time and space required by an algorithm. Thus, an algorithm should take little time and memory while meeting the acceptable limit of development time.
 Independent: an algorithm must be language-independent, meaning it should focus on the input and the procedure required to derive the output rather than on any particular programming language.
Advantages of an Algorithm
 Effective Communication: since it is written in a natural language like English, it becomes easy to understand the step-by-step delineation of a solution to a particular problem.
 Easy Debugging: a well-designed algorithm facilitates easy debugging to detect the logical errors inside the program.
 Easy and Efficient Coding: an algorithm is a blueprint of a program and helps in developing it.
 Independent of Programming Language: since it is language-independent, it can be coded in any high-level language.
Pseudocode
Pseudocode refers to an informal high-level description of the operating principle of a computer program or other algorithm. It uses the structural conventions of a standard programming language but is intended for human reading rather than machine reading.
Advantages of Pseudocode:
 Since it is similar to a programming language, it can be transformed into the actual programming language more quickly than a flowchart.
 A layman can easily understand it.
 It is more easily modifiable than a flowchart.
 Its implementation is beneficial for structured, designed elements.
 Errors can be detected before it is transformed into code.
Problem: Suppose there are 60 students in the class. How will you
calculate the number of absentees in the class?
Pseudocode Approach:
 Initialize variables: Count to zero, absent to zero, total to 60
 FOR EACH student PRESENT DO the following:
Increase Count by one
 Then subtract Count from total and store the result in absent
 Display the number of absent students
Algorithmic Approach:
 Count <- 0, absent <- 0, total <- 60
 REPEAT till all students counted
Count <- Count + 1
 absent <- total - Count
 Print "Number absent is:" , absent
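The algorithmic approach above translates directly into a short Python sketch. The attendance list passed in below is a hypothetical example input, not part of the original problem statement.

```python
def count_absent(students_present, total=60):
    # Count <- 0; total defaults to 60 as in the problem.
    count = 0
    for _student in students_present:   # REPEAT till all students counted
        count += 1                      # Count <- Count + 1
    absent = total - count              # absent <- total - Count
    return absent

# Hypothetical attendance list, for illustration only.
print("Number absent is:", count_absent(["asha", "ravi", "meena"]))  # Number absent is: 57
```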
The problem of sorting
Input: a sequence ⟨a1, a2, …, an⟩ of numbers.
Output: a permutation ⟨a′1, a′2, …, a′n⟩ such that a′1 ≤ a′2 ≤ … ≤ a′n.
Example:
Input: 8 2 4 9 3 6
Output: 2 3 4 6 8 9
Insertion Sort
INSERTION-SORT(A, n)   ⊳ A[1 . . n]
for j ← 2 to n
    do key ← A[j]
       i ← j − 1
       while i > 0 and A[i] > key
           do A[i+1] ← A[i]
              i ← i − 1
       A[i+1] ← key
Here A[1 . . j−1] is the sorted prefix and key is the element being inserted.
Example: A = ⟨8, 2, 4, 9, 3, 6⟩. Each line shows the array after one key is inserted into the sorted prefix:
8 2 4 9 3 6
2 8 4 9 3 6
2 4 8 9 3 6
2 4 8 9 3 6   (9 is already in place)
2 3 4 8 9 6
2 3 4 6 8 9   done
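The pseudocode above can be sketched as a 0-indexed Python function; the loop bounds shift by one relative to the 1-indexed slides.

```python
def insertion_sort(a):
    """Sort list `a` in place and return it (0-indexed INSERTION-SORT)."""
    for j in range(1, len(a)):        # pseudocode's j = 2..n becomes 1..n-1
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:  # shift larger elements one slot right
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key                # drop key into the vacated slot
    return a

print(insertion_sort([8, 2, 4, 9, 3, 6]))  # [2, 3, 4, 6, 8, 9]
```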
Worst case: input reverse sorted.
T(n) = ∑_{j=2}^{n} Θ(j) = Θ(n²)   [arithmetic series]
Average case: all permutations equally likely.
T(n) = ∑_{j=2}^{n} Θ(j/2) = Θ(n²)
Is insertion sort a fast sorting algorithm?
• Moderately so, for small n.
• Not at all, for large n.
 Insertion sort takes Θ(n²) in the worst case, so sorting (as a problem) is O(n²). Why?
 Any sorting algorithm must look at each item, so sorting is Ω(n).
 In fact, using (e.g.) merge sort, sorting is Θ(n lg n) in the worst case.
 Later, we will prove that no comparison sort can do better in the worst case.
Running time
• The running time depends on the input: an already sorted sequence
is easier to sort.
• Parameterize the running time by the size of the input, since short
sequences are easier to sort than long ones.
• Generally, we seek upper bounds on the running time, because
everybody likes a guarantee.
Worst-case: (usually)
• T(n) = maximum time of algorithm on any input of size n.
Average-case: (sometimes)
• T(n) = expected time of algorithm on any input of size n.
Best-case: (bogus)
• Cheat with a slow algorithm that works fast on some input.
What is insertion sort’s worst-case time?
It depends on the speed of our computer:
• relative speed (on the same machine),
• absolute speed (on different machines).
BIG IDEA:
• Ignore machine-dependent constants.
• Look at growth of T(n) as n → ∞ .
“Asymptotic Analysis”
Complexity of Algorithm
 Algorithm complexity measures how many steps an algorithm requires to solve the given problem.
 It evaluates the order of the count of operations executed by an algorithm as a function of input data size.
 O(f) notation represents the complexity of an algorithm; this is also termed asymptotic notation or "Big O" notation.
 Here f is a function of the input data size.
 The asymptotic complexity O(f) determines the order in which resources such as CPU time, memory, etc. are consumed by the algorithm.
 The complexity can take many forms: constant, logarithmic, linear, n·log(n), quadratic, cubic, exponential, etc.
Typical Complexities of an Algorithm
1. Constant Complexity:
It imposes a complexity of O(1). It undergoes an execution of a constant
number of steps like 1, 5, 10, etc. for solving a given problem. The count of
operations is independent of the input data size.
2. Logarithmic Complexity:
 It imposes a complexity of O(log(N)).
 It undergoes the execution of on the order of log(N) steps.
 For operations on N elements, the logarithm is usually taken to base 2.
 For N = 1,000,000, an algorithm with complexity O(log(N)) would undergo about 20 steps (up to a constant factor), since log2(1,000,000) ≈ 20.
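Binary search is the classic O(log(N)) example (it is not named on the slide; it is used here as an illustration): each comparison halves the remaining search range, so N = 1,000,000 needs at most about 20 comparisons.

```python
def binary_search(sorted_list, target):
    """Return (index, comparisons) for `target` in `sorted_list`,
    or (-1, comparisons) if absent. Each loop iteration halves the range."""
    lo, hi = 0, len(sorted_list) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return mid, steps
        elif sorted_list[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

idx, steps = binary_search(list(range(1_000_000)), 999_999)
print(steps)  # about 20 comparisons for N = 1,000,000
```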
3. Linear Complexity:
 It imposes a complexity of O(N).
 It takes about the same number of steps as the total number of elements to perform an operation on N elements.
 For example, if there are 500 elements, it will take about 500 steps. Basically, in linear complexity, the number of steps depends linearly on the number of elements.
4. Quadratic Complexity:
 It imposes a complexity of O(N²).
 For input data size N, it undergoes on the order of N² operations on N elements to solve a given problem.
 If N = 100, it will endure 10,000 steps.
 In other words, whenever the count of operations tends to have a quadratic relation with the input data size, it results in quadratic complexity.
5. Cubic Complexity:
 It imposes a complexity of O(N³).
 For input data size N, it executes on the order of N³ steps on N elements to solve a given problem.
 For example, if there are 100 elements, it is going to execute 1,000,000 steps.
6. Exponential Complexity
 It imposes a complexity such as O(2^N), O(N!), ….
 For N elements, it executes a count of operations that depends exponentially on the input data size.
 For example, if N = 10, then the exponential function 2^N results in 1024. Similarly, if N = 20, it results in 1,048,576, and if N = 100, it results in a number having 31 digits.
 The factorial function N! grows even faster; for example, N = 5 results in 120. Likewise, N = 10 results in 3,628,800, and so on.
 How do we approximate the time taken by an algorithm?
 There are two types of algorithms:
 Iterative Algorithm: in the iterative approach, the function repeatedly runs until a condition is met or it fails. It involves a looping construct.
 Recursive Algorithm: in the recursive approach, the function calls itself until a condition is met. It involves a branching (recursive) structure.
Asymptotic Notations
 Asymptotic notation is a way of comparing functions that ignores constant factors and small input sizes. Three main notations are used to describe the running-time complexity of an algorithm.
 They are defined for functions over the natural numbers.
 Ex: f(n) = Θ(n²).
 Describes how f(n) grows in comparison to n².
 Each notation defines a set of functions; in practice they are used to compare the sizes of two functions.
 The notations describe different rate-of-growth relations between the defining function and the defined set of functions.
Θ-notation
For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:
Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0 such that ∀ n ≥ n0, we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
g(n) is an asymptotically tight bound for f(n).
Intuitively: the set of all functions that have the same rate of growth as g(n).
O-notation
For a function g(n), we define O(g(n)), big-O of g(n), as the set:
O(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀ n ≥ n0, we have 0 ≤ f(n) ≤ c·g(n) }
g(n) is an asymptotic upper bound for f(n).
Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n).
f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)).
Θ(g(n)) ⊆ O(g(n)).
Ω-notation
For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:
Ω(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀ n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n) }
g(n) is an asymptotic lower bound for f(n).
Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).
f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)).
Θ(g(n)) ⊆ Ω(g(n)).
Relations Between the Notations:
 Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
 In practice, asymptotically tight bounds are obtained from asymptotic upper and lower bounds.
Theorem: For any two functions g(n) and f(n),
f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
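The Θ definition can be sanity-checked numerically. The function f(n) = 3n² + 2n + 1 and the constants c1 = 3, c2 = 4, n0 = 3 below are one illustrative choice (they are not from the slides, and the constants are not unique).

```python
# Check the Θ definition for f(n) = 3n² + 2n + 1 ∈ Θ(n²) over a finite range.
def f(n):
    return 3 * n * n + 2 * n + 1

def g(n):
    return n * n

c1, c2, n0 = 3, 4, 3
ok = all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print(ok)  # True: c1*g(n) <= f(n) <= c2*g(n) holds for all tested n >= n0
```

Note that n0 matters: at n = 2, f(2) = 17 already exceeds c2·g(2) = 16, so the upper inequality only holds from n0 = 3 onward.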
Asymptotic Notation in Equations
 Can use asymptotic notation in equations to replace expressions containing lower-order terms.
 For example,
4n³ + 3n² + 2n + 1 = 4n³ + 3n² + Θ(n)
= 4n³ + Θ(n²) = Θ(n³). How to interpret?
 In equations, Θ(f(n)) always stands for an anonymous function g(n) ∈ Θ(f(n)).
 In the example above, Θ(n²) stands for 3n² + 2n + 1.
o-notation
For a given function g(n), the set little-o:
o(g(n)) = { f(n) : ∀ c > 0, ∃ n0 > 0 such that ∀ n ≥ n0, we have 0 ≤ f(n) < c·g(n) }
f(n) becomes insignificant relative to g(n) as n approaches infinity:
lim_{n→∞} [f(n) / g(n)] = 0
g(n) is an upper bound for f(n) that is not asymptotically tight.
ω-notation
For a given function g(n), the set little-omega:
ω(g(n)) = { f(n) : ∀ c > 0, ∃ n0 > 0 such that ∀ n ≥ n0, we have 0 ≤ c·g(n) < f(n) }
f(n) becomes arbitrarily large relative to g(n) as n approaches infinity:
lim_{n→∞} [f(n) / g(n)] = ∞
g(n) is a lower bound for f(n) that is not asymptotically tight.
Comparison of Functions
The relations behave like comparisons of two real numbers a and b:
f(n) = O(g(n))  ≈  a ≤ b
f(n) = Ω(g(n))  ≈  a ≥ b
f(n) = Θ(g(n))  ≈  a = b
f(n) = o(g(n))  ≈  a < b
f(n) = ω(g(n))  ≈  a > b
Properties
 Transitivity
f(n) = Θ(g(n)) & g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
f(n) = O(g(n)) & g(n) = O(h(n)) ⇒ f(n) = O(h(n))
f(n) = Ω(g(n)) & g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n))
f(n) = o(g(n)) & g(n) = o(h(n)) ⇒ f(n) = o(h(n))
f(n) = ω(g(n)) & g(n) = ω(h(n)) ⇒ f(n) = ω(h(n))
 Reflexivity
f(n) = Θ(f(n))
f(n) = O(f(n))
f(n) = Ω(f(n))
 Symmetry
f(n) = Θ(g(n)) iff g(n) = Θ(f(n))
 Complementarity (transpose symmetry)
f(n) = O(g(n)) iff g(n) = Ω(f(n))
f(n) = o(g(n)) iff g(n) = ω(f(n))
Live Class (PDF 1)
 Substitution Method
 Recursion
 Master’s theorem
Merge sort
MERGE-SORT(A, n)   ⊳ A[1 . . n]
To sort n numbers:
1. If n = 1, done.
2. Recursively sort A[1 . . ⌈n/2⌉] and A[⌈n/2⌉+1 . . n].
3. “Merge” the 2 sorted lists.
Key subroutine: MERGE
Merging two sorted arrays:
Example: merge the two sorted arrays ⟨2, 7, 13, 20⟩ and ⟨1, 9, 11, 12⟩.
Repeatedly compare the smallest remaining element of each array and output the smaller one:
1, 2, 7, 9, 11, 12, and then the leftovers 13, 20.
Time = Θ(n) to merge a total of n elements (linear time).
Analyzing merge sort
MERGE-SORT(A, n)   ⊳ A[1 . . n]
To sort n numbers:
1. If n = 1, done.                                        → Θ(1)
2. Recursively sort A[1 . . ⌈n/2⌉] and A[⌈n/2⌉+1 . . n].  → 2T(n/2)
3. “Merge” the 2 sorted lists.                            → Θ(n)
Sloppiness: step 2 should really contribute T(⌈n/2⌉) + T(⌊n/2⌋), but it turns out not to matter asymptotically.
Recurrence for merge sort
T(n) = Θ(1)            if n = 1;
T(n) = 2T(n/2) + Θ(n)  if n > 1.
• We shall usually omit stating the base case when T(n) = Θ(1) for sufficiently small n (and when it has no effect on the solution to the recurrence).
• Further slides provide several ways to find a good upper bound
on T(n).
Recursion tree
Solve T(n) = 2T(n/2) + cn, where c > 0 is constant.

                 cn                          level sum: cn
               /    \
           cn/2      cn/2                    level sum: cn
          /   \      /   \
       cn/4  cn/4  cn/4  cn/4                level sum: cn
         …                                   height h = lg n
      Θ(1)  Θ(1)  …  Θ(1)                    #leaves = n, so Θ(n)

Each of the lg n levels of internal nodes contributes cn, and the leaves contribute Θ(n), so
Total = cn·lg n + Θ(n) = Θ(n lg n).
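The recursion-tree total can be checked numerically by unrolling the recurrence for powers of 2 (a sketch, with c = 1 as an illustrative constant):

```python
from math import log2

def T(n, c=1):
    # Recurrence T(n) = 2*T(n/2) + c*n with base case T(1) = c; n a power of 2.
    return c if n == 1 else 2 * T(n // 2, c) + c * n

for n in (16, 256, 4096):
    # The tree has lg n levels of cost c*n plus n leaves of cost c,
    # so for c = 1 the closed form is exactly n * (lg n + 1).
    print(n, T(n), n * (log2(n) + 1))
```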
Conclusions
• Θ(n lg n) grows more slowly than Θ(n²).
• Therefore, merge sort asymptotically beats insertion sort in the worst case.
• In practice, merge sort beats insertion sort for n > 30 or so.
• Go test it out for yourself!
Divide And Conquer
This technique can be divided into the following three parts:
1. Divide: break the problem into subproblems.
2. Conquer: solve the subproblems by recursing until they are small enough to solve directly.
3. Combine: combine the subproblem solutions into a solution for the original problem.
Example:
(merge sort)
1. Divide: Trivial.
2. Conquer: Recursively sort 2 subarrays.
3. Combine: Linear-time merge.
T(n) = 2T(n/2) + O(n)
  2 : number of subproblems
  n/2 : subproblem size
  O(n) : work dividing and combining
Standard algorithms that follow the Divide and Conquer approach
1. Quicksort is a sorting algorithm. The algorithm picks a pivot element and rearranges the array elements so that all elements smaller than the picked pivot move to the left side of the pivot, and all greater elements move to the right side. Finally, the algorithm recursively sorts the subarrays on the left and right of the pivot element.
2. Merge Sort is also a sorting algorithm. The algorithm divides the array into two halves, recursively sorts them, and finally merges the two sorted halves.
3. Closest Pair of Points: the problem is to find the closest pair of points in a set of points in the x-y plane. The problem can be solved in O(n^2) time by calculating the distance between every pair of points and taking the minimum. The Divide and Conquer algorithm solves the problem in O(n log n) time.
4. Strassen’s Algorithm is an efficient algorithm to multiply two matrices. A simple method to multiply two matrices needs 3 nested loops and is O(n^3). Strassen’s algorithm multiplies two matrices in about O(n^2.81) time (the exponent is log2 7 ≈ 2.807).
5. Cooley–Tukey Fast Fourier Transform (FFT) algorithm is the most common algorithm for FFT. It is a divide and conquer algorithm which works in O(n log n) time.
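The quicksort description in item 1 can be sketched in Python. This simple out-of-place version mirrors the pick-pivot-and-partition idea; textbook quicksort usually partitions in place.

```python
def quicksort(a):
    """Divide and conquer: partition around a pivot, then recurse on each side."""
    if len(a) <= 1:
        return a
    pivot = a[len(a) // 2]                       # middle element as pivot
    smaller = [x for x in a if x < pivot]        # left side of the pivot
    equal   = [x for x in a if x == pivot]
    larger  = [x for x in a if x > pivot]        # right side of the pivot
    return quicksort(smaller) + equal + quicksort(larger)

print(quicksort([8, 2, 4, 9, 3, 6]))  # [2, 3, 4, 6, 8, 9]
```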
Live Class (PDF 2)
 Quick sort
 Sorting in Linear Time
 Heap Sort
Heap Sort
 If the parent node is stored at index i, the left child can be found at index 2i + 1 and the right child at index 2i + 2 (assuming the indexing starts at 0).
 Heap Sort Algorithm for sorting in increasing order:
1. Build a max heap from the input data.
2. At this point, the largest item is stored at the root of the heap. Replace
it with the last item of the heap followed by reducing the size of heap by
1. Finally, heapify the root of the tree.
3. Repeat step 2 while the size of the heap is greater than 1.
 Working of Heap Sort
1. Since the tree satisfies the max-heap property, the largest item is stored at the root node.
2. Swap: remove the root element and put it at the end of the array (the nth position); put the last item of the heap in the vacant root position.
3. Remove: reduce the size of the heap by 1.
4. Heapify: heapify the root element again so that the largest remaining element is at the root.
5. The process is repeated until all the items of the list are sorted.
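The steps above can be sketched in Python with the 0-indexed child formulas from this slide (left child 2i + 1, right child 2i + 2):

```python
def heapify(a, n, i):
    """Sift a[i] down so the subtree rooted at i satisfies the max-heap
    property, considering only the first n elements of the array."""
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < n and a[left] > a[largest]:
        largest = left
    if right < n and a[right] > a[largest]:
        largest = right
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        heapify(a, n, largest)

def heap_sort(a):
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # step 1: build a max heap
        heapify(a, n, i)
    for end in range(n - 1, 0, -1):       # steps 2-4: swap root to end,
        a[0], a[end] = a[end], a[0]       # shrink the heap, re-heapify
        heapify(a, end, 0)
    return a

print(heap_sort([8, 2, 4, 9, 3, 6]))  # [2, 3, 4, 6, 8, 9]
```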
THANK YOU

More Related Content

Similar to CSE 350 Algorithms Design and Analysis

Algorithm in Computer, Sorting and Notations
Algorithm in Computer, Sorting  and NotationsAlgorithm in Computer, Sorting  and Notations
Algorithm in Computer, Sorting and NotationsAbid Kohistani
 
Aad introduction
Aad introductionAad introduction
Aad introductionMr SMAK
 
Analysis of Algorithm full version 2024.pptx
Analysis of Algorithm  full version  2024.pptxAnalysis of Algorithm  full version  2024.pptx
Analysis of Algorithm full version 2024.pptxrajesshs31r
 
Design Analysis of Alogorithm 1 ppt 2024.pptx
Design Analysis of Alogorithm 1 ppt 2024.pptxDesign Analysis of Alogorithm 1 ppt 2024.pptx
Design Analysis of Alogorithm 1 ppt 2024.pptxrajesshs31r
 
Asymptotic Notations
Asymptotic NotationsAsymptotic Notations
Asymptotic NotationsRishabh Soni
 
Data Structures and Algorithm Analysis
Data Structures  and  Algorithm AnalysisData Structures  and  Algorithm Analysis
Data Structures and Algorithm AnalysisMary Margarat
 
Asymptotic Notations.pptx
Asymptotic Notations.pptxAsymptotic Notations.pptx
Asymptotic Notations.pptxSunilWork1
 
2. Introduction to Algorithm.pptx
2. Introduction to Algorithm.pptx2. Introduction to Algorithm.pptx
2. Introduction to Algorithm.pptxRahikAhmed1
 
Asymptotics 140510003721-phpapp02
Asymptotics 140510003721-phpapp02Asymptotics 140510003721-phpapp02
Asymptotics 140510003721-phpapp02mansab MIRZA
 
Advanced Datastructures and algorithms CP4151unit1b.pdf
Advanced Datastructures and algorithms CP4151unit1b.pdfAdvanced Datastructures and algorithms CP4151unit1b.pdf
Advanced Datastructures and algorithms CP4151unit1b.pdfSheba41
 
Unit 1, ADA.pptx
Unit 1, ADA.pptxUnit 1, ADA.pptx
Unit 1, ADA.pptxjinkhatima
 
CP4151 Advanced data structures and algorithms
CP4151 Advanced data structures and algorithmsCP4151 Advanced data structures and algorithms
CP4151 Advanced data structures and algorithmsSheba41
 

Similar to CSE 350 Algorithms Design and Analysis (20)

Algorithm in Computer, Sorting and Notations
Algorithm in Computer, Sorting  and NotationsAlgorithm in Computer, Sorting  and Notations
Algorithm in Computer, Sorting and Notations
 
Analysis of algorithms
Analysis of algorithmsAnalysis of algorithms
Analysis of algorithms
 
Aad introduction
Aad introductionAad introduction
Aad introduction
 
Analysis of Algorithm full version 2024.pptx
Analysis of Algorithm  full version  2024.pptxAnalysis of Algorithm  full version  2024.pptx
Analysis of Algorithm full version 2024.pptx
 
Design Analysis of Alogorithm 1 ppt 2024.pptx
Design Analysis of Alogorithm 1 ppt 2024.pptxDesign Analysis of Alogorithm 1 ppt 2024.pptx
Design Analysis of Alogorithm 1 ppt 2024.pptx
 
Asymptotic Notations
Asymptotic NotationsAsymptotic Notations
Asymptotic Notations
 
Algorithms
Algorithms Algorithms
Algorithms
 
Data Structures and Algorithm Analysis
Data Structures  and  Algorithm AnalysisData Structures  and  Algorithm Analysis
Data Structures and Algorithm Analysis
 
Algorithms overview
Algorithms overviewAlgorithms overview
Algorithms overview
 
chapter 1
chapter 1chapter 1
chapter 1
 
Asymptotic Notations.pptx
Asymptotic Notations.pptxAsymptotic Notations.pptx
Asymptotic Notations.pptx
 
2. Introduction to Algorithm.pptx
2. Introduction to Algorithm.pptx2. Introduction to Algorithm.pptx
2. Introduction to Algorithm.pptx
 
Algorithm
AlgorithmAlgorithm
Algorithm
 
Algorithm
AlgorithmAlgorithm
Algorithm
 
Asymptotics 140510003721-phpapp02
Asymptotics 140510003721-phpapp02Asymptotics 140510003721-phpapp02
Asymptotics 140510003721-phpapp02
 
Advanced Datastructures and algorithms CP4151unit1b.pdf
Advanced Datastructures and algorithms CP4151unit1b.pdfAdvanced Datastructures and algorithms CP4151unit1b.pdf
Advanced Datastructures and algorithms CP4151unit1b.pdf
 
Unit 1, ADA.pptx
Unit 1, ADA.pptxUnit 1, ADA.pptx
Unit 1, ADA.pptx
 
CP4151 Advanced data structures and algorithms
CP4151 Advanced data structures and algorithmsCP4151 Advanced data structures and algorithms
CP4151 Advanced data structures and algorithms
 
Searching Algorithms
Searching AlgorithmsSearching Algorithms
Searching Algorithms
 
Design & Analysis Of Algorithm
Design & Analysis Of AlgorithmDesign & Analysis Of Algorithm
Design & Analysis Of Algorithm
 

Recently uploaded

SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )Tsuyoshi Horigome
 
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130Suhani Kapoor
 
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...Soham Mondal
 
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdf
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdfCCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdf
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdfAsst.prof M.Gokilavani
 
Oxy acetylene welding presentation note.
Oxy acetylene welding presentation note.Oxy acetylene welding presentation note.
Oxy acetylene welding presentation note.eptoze12
 
microprocessor 8085 and its interfacing
microprocessor 8085  and its interfacingmicroprocessor 8085  and its interfacing
microprocessor 8085 and its interfacingjaychoudhary37
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024Mark Billinghurst
 
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVHARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVRajaP95
 
VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130
VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130
VIP Call Girls Service Kondapur Hyderabad Call +91-8250192130Suhani Kapoor
 
Past, Present and Future of Generative AI
Past, Present and Future of Generative AIPast, Present and Future of Generative AI
Past, Present and Future of Generative AIabhishek36461
 
Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024hassan khalil
 
Internship report on mechanical engineering
Internship report on mechanical engineeringInternship report on mechanical engineering
Internship report on mechanical engineeringmalavadedarshan25
 
Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...
Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...
Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...srsj9000
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCall Girls in Nagpur High Profile
 
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerAnamika Sarkar
 
Gurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort service
Gurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort serviceGurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort service
Gurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort servicejennyeacort
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxJoão Esperancinha
 

Recently uploaded (20)

SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )
 
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
 
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
 
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdf
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdfCCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdf
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdf
 
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCRCall Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
 
Oxy acetylene welding presentation note.
Oxy acetylene welding presentation note.Oxy acetylene welding presentation note.
Oxy acetylene welding presentation note.
 
microprocessor 8085 and its interfacing
microprocessor 8085  and its interfacingmicroprocessor 8085  and its interfacing
microprocessor 8085 and its interfacing
 
IVE Industry Focused Event - Defence Sector 2024

CSE 350 Algorithms Design and Analysis

  • 1. Design and Analysis of Algorithms CSE 350 Unit 1
  • 2. Syllabus Unit 1 Introduction A Introduction : Algorithms, Analyzing algorithms, Complexity of algorithms, Growth of functions, Performance measurements B Asymptotic Notations and their properties – Mathematical analysis for Recursive and Non-recursive algorithms, Recurrences relations, Master Method C Divide-and-conquer: Analysis and Structure of divide-and- conquer algorithms, Divide-and-conquer examples-Quick sort, Merge sort, Sorting in Linear Time, Heap Sort
  • 3. What is Algorithm?  A finite set of instructions that specifies a sequence of operations to be carried out in order to solve a specific problem or class of problems is called an Algorithm.
  • 4. Why study Algorithm?  As the speed of processors increases, performance is frequently said to be less central than other software quality characteristics (e.g., security, extensibility, reusability).  However, large problem sizes are commonplace in the area of computational science, which makes performance a very important factor.  The study of Algorithms, therefore, gives us a language to express performance as a function of problem size.
  • 5. L1.3 Why study algorithms and performance? • Algorithms help us to understand scalability. • Performance often draws the line between what is feasible and what is impossible. • Algorithmic mathematics provides a language for talking about program behavior. • The lessons of program performance generalize to other computing resources. • Speed is fun!
  • 6. Algorithm: The theoretical study of computer-program performance and resource usage. What’s more important than performance? • Modularity • Correctness • Maintainability • Functionality • Robustness • User-friendliness • Programmer Time • Simplicity • Extensibility • Reliability
  • 7. Characteristics of Algorithms  Input: It should externally supply zero or more quantities.  Output: It results in at least one quantity.  Definiteness: Each instruction should be clear and unambiguous.  Finiteness: An algorithm should terminate after executing a finite number of steps.  Effectiveness: Every instruction should be fundamental enough to be carried out, in principle, by a person using only pen and paper.
  • 8.  Feasible: It must be feasible enough to produce each instruction.  Flexibility: It must be flexible enough to carry out desired changes with little effort.  Efficient: Efficiency is measured in terms of the time and space an algorithm requires. Thus, an algorithm must ensure that it takes little time and little memory while meeting the acceptable limit of development time.  Independent: An algorithm must be language-independent, which means that it should mainly focus on the input and the procedure required to derive the output instead of depending upon a particular language.
  • 9. Advantages of an Algorithm  Effective Communication: Since it is written in a natural language like English, it becomes easy to understand the step-by-step delineation of a solution to any particular problem.  Easy Debugging: A well-designed algorithm facilitates easy debugging to detect the logical errors that occur inside the program.  Easy and Efficient Coding: An algorithm is nothing but a blueprint of a program that helps develop the program.  Independent of Programming Language: Since it is language-independent, it can be easily coded in any high-level language.
  • 10. Pseudocode Pseudocode refers to an informal high-level description of the operating principle of a computer program or other algorithm. It uses the structural conventions of a standard programming language but is intended for human reading rather than machine reading. Advantages of Pseudocode:  Since it is similar to a programming language, it can be transformed into the actual programming language more quickly than a flowchart.  A layman can easily understand it.  Easily modifiable as compared to flowcharts.  Its implementation is beneficial for structured, designed elements.  Errors can easily be detected before transforming it into code.
  • 11. Problem: Suppose there are 60 students in the class. How will you calculate the number of absentees in the class?
  • 12. Pseudo Approach:  Initialize a variable called as Count to zero, absent to zero, total to 60  FOR EACH Student PRESENT DO the following: Increase the Count by One  Then Subtract Count from total and store the result in absent  Display the number of absent students
  • 13. Algorithmic Approach:  Count <- 0, absent <- 0, total <- 60  REPEAT till all students counted Count <- Count + 1  absent <- total - Count  Print "Number absent is:" , absent
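The counting logic above can be sketched in Python; the roll-call input here is a made-up example for illustration:

```python
def count_absent(total, present_students):
    # Count each student who is present, then subtract from the total.
    count = 0
    for _ in present_students:
        count += 1
    absent = total - count
    return absent

# Hypothetical roll call: 55 of the 60 students answered.
print("Number absent is:", count_absent(60, range(55)))  # Number absent is: 5
```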
  • 14. L1.4 The problem of sorting Input: sequence a1, a2, …, an of numbers. Output: permutation a'1, a'2, …, a'n such that a'1 ≤ a'2 ≤ … ≤ a'n. Example: Input: 8 2 4 9 3 6 Output: 2 3 4 6 8 9
  • 15. L1.5 Insertion Sort INSERTION-SORT (A, n) ⊳ A[1 . . n] for j ← 2 to n do key ← A[ j ] i ← j – 1 while i > 0 and A[i] > key do A[i+1] ← A[i] i ← i – 1 A[i+1] ← key “pseudocode”
  • 16. L1.6 8 2 4 9 3 6
  • 17. L1.7 8 2 4 9 3 6
  • 18. L1.8 8 2 4 9 3 6 2 8 4 9 3 6
  • 19. L1.19 8 2 4 9 3 6 2 8 4 9 3 6
  • 20. L1.20 8 2 4 9 3 6 2 8 4 9 3 6 2 4 8 9 3 6
  • 21. L1.21 8 2 4 9 3 6 2 8 4 9 3 6 2 4 8 9 3 6
  • 22. L1.22 8 2 4 9 3 6 2 8 4 9 3 6 2 4 8 9 3 6 2 4 8 9 3 6
  • 23. L1.23 8 2 4 9 3 6 2 8 4 9 3 6 2 4 8 9 3 6 2 4 8 9 3 6
  • 24. L1.24 8 2 4 9 3 6 2 8 4 9 3 6 2 4 8 9 3 6 2 4 8 9 3 6 2 3 4 8 9 6
  • 25. L1.25 8 2 4 9 3 6 2 8 4 9 3 6 2 4 8 9 3 6 2 4 8 9 3 6 2 3 4 8 9 6
  • 26. L1.26 8 2 4 9 3 6 2 8 4 9 3 6 2 4 8 9 3 6 2 4 8 9 3 6 2 3 4 8 9 6 2 3 4 6 8 9 done
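The INSERTION-SORT pseudocode traced above can be written as runnable Python (indices start at 0 rather than 1):

```python
def insertion_sort(a):
    """Sort list a in place, following the INSERTION-SORT pseudocode."""
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        # Shift larger elements of the sorted prefix one slot to the right.
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a

print(insertion_sort([8, 2, 4, 9, 3, 6]))  # [2, 3, 4, 6, 8, 9]
```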
  • 27. L1.27 Worst case: Input reverse sorted. T(n) = Σ_{j=2}^{n} Θ(j) = Θ(n²) [arithmetic series] Average case: All permutations equally likely. T(n) = Σ_{j=2}^{n} Θ(j/2) = Θ(n²) Is insertion sort a fast sorting algorithm? • Moderately so, for small n. • Not at all, for large n.
  • 28.  Insertion sort takes Θ(n²) in the worst case, so sorting (as a problem) is O(n²). Why?  Any sort algorithm must look at each item, so sorting is Ω(n).  In fact, using (e.g.) merge sort, sorting is Θ(n lg n) in the worst case.  Later, we will prove that no comparison sort can hope to do better in the worst case. Comp 122
  • 29. Running time L1.29 • The running time depends on the input: an already sorted sequence is easier to sort. • Parameterize the running time by the size of the input, since short sequences are easier to sort than long ones. • Generally, we seek upper bounds on the running time, because everybody likes a guarantee.
  • 30. L1.30 Worst-case: (usually) • T(n) = maximum time of algorithm on any input of size n. Average-case: (sometimes) • T(n) = expected time of algorithm on any input of size n. Best-case: (bogus) • Cheat with a slow algorithm that works fast on some input.
  • 31. L1.31 What is insertion sort’s worst-case time? It depends on the speed of our computer: • relative speed (on the same machine), • absolute speed (on different machines). BIG IDEA: • Ignore machine-dependent constants. • Look at growth of T(n) as n → ∞. “Asymptotic Analysis”
  • 32. Complexity of Algorithm  Algorithm complexity measures how many steps an algorithm requires to solve the given problem.  It evaluates the order of the count of operations executed by an algorithm as a function of input data size.  O(f) notation represents the complexity of an algorithm; it is also termed Asymptotic notation or "Big O" notation.  Here f corresponds to a function of the input data size.  The asymptotic complexity O(f) determines the order in which resources such as CPU time, memory, etc. are consumed by the algorithm.  The complexity can take any form, such as constant, logarithmic, linear, n*log(n), quadratic, cubic, exponential, etc.
  • 33. Typical Complexities of an Algorithm 1. Constant Complexity: It imposes a complexity of O(1). It undergoes an execution of a constant number of steps like 1, 5, 10, etc. for solving a given problem. The count of operations is independent of the input data size.
  • 34. 2. Logarithmic Complexity:  It imposes a complexity of O(log(N)).  It undergoes the execution of the order of log(N) steps.  To perform operations on N elements, it often takes the logarithmic base as 2.  For N = 1,000,000, an algorithm that has a complexity of O(log(N)) would undergo about 20 steps.
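The "20 steps for N = 1,000,000" figure can be checked directly: the number of times N elements can be halved down to one is about log₂(N).

```python
import math

# Roughly how many halvings reduce N elements to one: ceil(log2(N)).
for n in (1_000, 1_000_000, 1_000_000_000):
    print(n, "->", math.ceil(math.log2(n)), "steps")
# 1,000 -> 10 steps; 1,000,000 -> 20 steps; 1,000,000,000 -> 30 steps
```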
  • 35. 3. Linear Complexity:  It imposes a complexity of O(N).  It encompasses the same number of steps as the total number of elements to implement an operation on N elements.  For example, if there exist 500 elements, then it will take about 500 steps. Basically, in linear complexity, the number of steps linearly depends on the number of elements.
  • 36. 4. Quadratic Complexity:  It imposes a complexity of O(N²).  For N input data size, it undergoes the order of N² count of operations on N number of elements for solving a given problem.  If N = 100, it will endure 10,000 steps.  In other words, whenever the order of operations tends to have a quadratic relation with the input data size, it results in quadratic complexity.
  • 37. 5. Cubic Complexity:  It imposes a complexity of O(N³).  For N input data size, it executes the order of N³ steps on N elements to solve a given problem.  For example, if there exist 100 elements, it is going to execute 1,000,000 steps.
  • 38. 6. Exponential Complexity  It imposes a complexity such as O(2^N) or O(N!).  For N elements, it will execute an order of count of operations that is exponentially dependent on the input data size.  For example, if N = 10, then the exponential function 2^N will result in 1024. Similarly, if N = 20, it will result in 1,048,576, and if N = 100, it will result in a number having 31 digits.  The exponential function N! grows even faster; for example, N = 5 will result in 120. Likewise, if N = 10, it will result in 3,628,800 and so on.
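The growth figures quoted above can be verified in a few lines of Python:

```python
import math

print(2**10)               # 1024
print(2**20)               # 1048576
print(len(str(2**100)))    # 31  (2^100 has 31 decimal digits)
print(math.factorial(5))   # 120
print(math.factorial(10))  # 3628800
```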
  • 39.  How to approximate the time taken by the Algorithm?  There are two types of algorithms:  Iterative Algorithm: In the iterative approach, the function repeatedly runs until the condition is met or it fails. It involves the looping construct.  Recursive Algorithm: In the recursive approach, the function calls itself until the condition is met. It integrates the branching structure.
  • 40. Asymptotic Notations  Asymptotic notation is a way of comparing functions that ignores constant factors and small input sizes. The following notations are used to calculate the running time complexity of an algorithm:
  • 41.  Defined for functions over the natural numbers.  Ex: f(n) = Θ(n²).  Describes how f(n) grows in comparison to n².  Define a set of functions; in practice used to compare two function sizes.  The notations describe different rate-of-growth relations between the defining function and the defined set of functions.
  • 42. Θ-notation For function g(n), we define Θ(g(n)), big-Theta of g(n), as the set: Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0 such that ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) } g(n) is an asymptotically tight bound for f(n). Intuitively: the set of all functions that have the same rate of growth as g(n).
  • 43. O-notation For function g(n), we define O(g(n)), big-O of g(n), as the set: O(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀ n ≥ n0, 0 ≤ f(n) ≤ c·g(n) } g(n) is an asymptotic upper bound for f(n). Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n). f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)). Θ(g(n)) ⊆ O(g(n)).
  • 44. Ω-notation For function g(n), we define Ω(g(n)), big-Omega of g(n), as the set: Ω(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀ n ≥ n0, 0 ≤ c·g(n) ≤ f(n) } g(n) is an asymptotic lower bound for f(n). Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n). f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)). Θ(g(n)) ⊆ Ω(g(n)).
  • 46. Theorem: For any two functions g(n) and f(n), f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)). In other words, Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).  In practice, asymptotically tight bounds are obtained from asymptotic upper and lower bounds.
  • 47. Asymptotic Notation in Equations  Can use asymptotic notation in equations to replace expressions containing lower-order terms.  For example, 4n³ + 3n² + 2n + 1 = 4n³ + 3n² + Θ(n) = 4n³ + Θ(n²) = Θ(n³). How to interpret?  In equations, Θ(f(n)) always stands for an anonymous function g(n) ∈ Θ(f(n)).  In the example above, Θ(n²) stands for 3n² + 2n + 1.
  • 48. o-notation For a given function g(n), the set little-o: o(g(n)) = { f(n) : ∀ c > 0, ∃ n0 > 0 such that ∀ n ≥ n0, 0 ≤ f(n) < c·g(n) }. f(n) becomes insignificant relative to g(n) as n approaches infinity: lim (n→∞) [f(n) / g(n)] = 0. g(n) is an upper bound for f(n) that is not asymptotically tight.
  • 49. ω-notation For a given function g(n), the set little-omega: ω(g(n)) = { f(n) : ∀ c > 0, ∃ n0 > 0 such that ∀ n ≥ n0, 0 ≤ c·g(n) < f(n) }. f(n) becomes arbitrarily large relative to g(n) as n approaches infinity: lim (n→∞) [f(n) / g(n)] = ∞. g(n) is a lower bound for f(n) that is not asymptotically tight.
  • 50. Comparison of Functions (analogy between comparing functions f, g and comparing real numbers a, b): f(n) = O(g(n)) ≈ a ≤ b f(n) = Ω(g(n)) ≈ a ≥ b f(n) = Θ(g(n)) ≈ a = b f(n) = o(g(n)) ≈ a < b f(n) = ω(g(n)) ≈ a > b
  • 51. Properties  Transitivity f(n) = Θ(g(n)) & g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n)) f(n) = O(g(n)) & g(n) = O(h(n)) ⇒ f(n) = O(h(n)) f(n) = Ω(g(n)) & g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n)) f(n) = o(g(n)) & g(n) = o(h(n)) ⇒ f(n) = o(h(n)) f(n) = ω(g(n)) & g(n) = ω(h(n)) ⇒ f(n) = ω(h(n))  Reflexivity f(n) = Θ(f(n)) f(n) = O(f(n)) f(n) = Ω(f(n))
  • 52.  Symmetry f(n) = Θ(g(n)) iff g(n) = Θ(f(n))  Complementarity (transpose symmetry) f(n) = O(g(n)) iff g(n) = Ω(f(n)) f(n) = o(g(n)) iff g(n) = ω(f(n))
  • 53. Live Class (PDF 1)  Substitution Method  Recursion  Master’s theorem
  • 54. Merge sort L1.54 MERGE-SORT A[1 . . n] To sort n numbers: 1. If n = 1, done. 2. Recursively sort A[ 1 . . n/2 ] and A[ n/2+1 . . n ] . 3. “Merge” the 2 sorted lists. Key subroutine: MERGE
  • 55. Merging two sorted arrays: L1.55 20 12 13 11 7 9 2 1
  • 57. L1.57 20 12 20 12 13 11 13 11 7 9 7 9 2 1 2 1
  • 58. L1.58 20 12 20 12 13 11 13 11 7 9 7 9 2 1 2 1 2
  • 59. L1.59 20 12 20 12 20 12 13 11 13 11 13 11 7 9 7 9 7 9 2 1 2 1 2
  • 60. L1.60 20 12 20 12 20 12 13 11 13 11 13 11 7 9 7 9 7 9 2 1 2 1 2 7
  • 61. L1.61 20 12 20 12 20 12 20 12 13 11 13 11 13 11 13 11 7 9 7 9 7 9 9 2 1 2 1 2 7
  • 62. L1.62 20 12 20 12 20 12 20 12 13 11 13 11 13 11 13 11 7 9 7 9 7 9 9 2 1 2 1 2 7 9
  • 63. L1.63 20 12 20 12 20 12 20 12 20 12 13 11 13 11 13 11 13 11 13 11 7 9 7 9 7 9 9 2 1 2 1 2 7 9
  • 64. L1.64 20 12 20 12 20 12 20 12 20 12 13 11 13 11 13 11 13 11 13 11 7 9 7 9 7 9 9 2 1 2 1 2 7 9 11
  • 65. L1.65 20 12 20 12 20 12 20 12 20 12 20 12 13 11 13 11 13 11 13 11 13 11 13 7 9 7 9 7 9 9 2 1 2 1 2 7 9 11
  • 66. L1.66 20 12 20 12 20 12 20 12 20 12 20 12 13 11 13 11 13 11 13 11 13 11 13 7 9 7 9 7 9 9 2 1 2 1 2 7 9 11 12
  • 67. L1.67 20 12 20 12 20 12 20 12 20 12 20 12 13 11 13 11 13 11 13 11 13 11 13 7 9 7 9 7 9 9 2 1 2 1 2 7 9 11 12 Time = (n) to merge a total of n elements (linear time).
  • 68. Analyzing merge sort L1.68 MERGE-SORT (A, n) ⊳ A[1 . . n] To sort n numbers: 1. If n = 1, done. [Θ(1)] 2. Recursively sort A[ 1 . . n/2 ] and A[ n/2+1 . . n ]. [2T(n/2)] 3. “Merge” the 2 sorted lists. [Θ(n)] Sloppiness: should be T(⌈n/2⌉) + T(⌊n/2⌋), but it turns out not to matter asymptotically.
  • 69. Recurrence for merge sort L1.69 T(n) = Θ(1) if n = 1; T(n) = 2T(n/2) + Θ(n) if n > 1. • We shall usually omit stating the base case when T(n) = Θ(1) for sufficiently small n (and when it has no effect on the solution to the recurrence). • Further slides provide several ways to find a good upper bound on T(n).
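The recurrence above comes directly from the structure of the algorithm; a runnable Python sketch of MERGE-SORT and its MERGE subroutine makes the two terms visible:

```python
def merge(left, right):
    """Merge two sorted lists in linear time: the Theta(n) term."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])   # one of these two tails is empty
    out.extend(right[j:])
    return out

def merge_sort(a):
    if len(a) <= 1:                 # base case: T(n) = Theta(1)
        return a
    mid = len(a) // 2
    # two recursive calls on halves: the 2T(n/2) term
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

print(merge_sort([20, 12, 13, 11, 7, 9, 2, 1]))  # [1, 2, 7, 9, 11, 12, 13, 20]
```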
  • 70. L1.70 Recursion tree Solve T(n) = 2T(n/2) + cn, where c > 0 is constant.
  • 71. L1.71 Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. T(n)
  • 72. L1.72 T(n/2) T(n/2) Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. cn
  • 73. L1.73 Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. cn T(n/4) T(n/4) T(n/4) T(n/4) cn/2 cn/2
  • 74. L1.74 Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. cn cn/4 cn/4 cn/4 cn/4 cn/2 cn/2 Θ(1)
  • 75. L1.75 Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. cn cn/4 cn/4 cn/4 cn/4 cn/2 cn/2 Θ(1) h = lg n
  • 76. L1.76 cn/4 cn/4 cn/4 cn/4 cn/2 cn/2 Θ(1) h = lg n Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. cn cn
  • 77. L1.77 Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. cn/4 cn/4 cn/4 cn/4 cn/2 Θ(1) h = lg n cn cn cn/2 cn
  • 78. L1.78 Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. cn/4 cn/4 cn/2 Θ(1) h = lg n cn cn cn/2 cn cn/4 cn/4 cn …
  • 79. L1.79 Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. cn/4 cn/4 cn/2 h = lg n cn cn cn/2 cn cn/4 cn/4 cn Θ(1) #leaves = n ⇒ Θ(n) …
  • 80. L1.80 Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. cn/4 cn/4 cn/2 h = lg n cn cn cn/2 cn cn/4 cn/4 cn Θ(1) #leaves = n ⇒ Θ(n) Total = Θ(n lg n) …
  • 81. Conclusions L1.81 • Θ(n lg n) grows more slowly than Θ(n²). • Therefore, merge sort asymptotically beats insertion sort in the worst case. • In practice, merge sort beats insertion sort for n > 30 or so. • Go test it out for yourself!
  • 82. Divide And Conquer This technique can be divided into the following three parts: 1. Divide: Divide the problem into a number of subproblems. 2. Conquer: Solve the subproblems by recursive calls until they are small enough to be solved directly. 3. Combine: Combine the solutions of the subproblems to obtain the solution of the original problem.
  • 83. L3.3 Example: (merge sort) 1. Divide: Trivial. 2. Conquer: Recursively sort 2 subarrays. 3. Combine: Linear-time merge. T(n) = 2T(n/2) + O(n), where 2 = # subproblems, n/2 = subproblem size, and O(n) = work dividing and combining.
  • 84. Standard algorithms that follow the Divide and Conquer strategy 1. Quicksort is a sorting algorithm. The algorithm picks a pivot element and rearranges the array so that all elements smaller than the pivot move to its left and all greater elements move to its right. Finally, it recursively sorts the subarrays on the left and right of the pivot. 2. Merge Sort is also a sorting algorithm. The algorithm divides the array into two halves, recursively sorts them, and finally merges the two sorted halves. 3. Closest Pair of Points The problem is to find the closest pair among a set of points in the x-y plane. It can be solved in O(n²) time by computing the distance of every pair of points and taking the minimum. The Divide and Conquer algorithm solves the problem in O(n log n) time.
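The quicksort scheme in point 1 can be sketched as follows; choosing the last element as the pivot is one common convention (this is an illustrative sketch, not an in-place partition):

```python
def quicksort(a):
    """Return a sorted copy of a; pivot = last element."""
    if len(a) <= 1:
        return a
    pivot = a[-1]
    smaller = [x for x in a[:-1] if x <= pivot]   # moves to the pivot's left
    greater = [x for x in a[:-1] if x > pivot]    # moves to the pivot's right
    # recursively sort the subarrays on either side of the pivot
    return quicksort(smaller) + [pivot] + quicksort(greater)

print(quicksort([8, 2, 4, 9, 3, 6]))  # [2, 3, 4, 6, 8, 9]
```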
  • 85. 4. Strassen’s Algorithm is an efficient algorithm to multiply two matrices. A simple method to multiply two matrices needs 3 nested loops and is O(n³). Strassen’s algorithm multiplies two matrices in O(n^2.8074) time. 5. Cooley–Tukey Fast Fourier Transform (FFT) algorithm is the most common algorithm for FFT. It is a divide and conquer algorithm which works in O(n log n) time.
  • 86. Live Class (PDF 2)  Quick sort  Sorting in Linear Time  Heap Sort
  • 87. Heap Sort  If a parent node is stored at index i, the left child can be found at index 2*i + 1 and the right child at 2*i + 2 (assuming the indexing starts at 0).  Heap Sort Algorithm for sorting in increasing order: 1. Build a max heap from the input data. 2. At this point, the largest item is stored at the root of the heap. Replace it with the last item of the heap, then reduce the size of the heap by 1. Finally, heapify the root of the tree. 3. Repeat step 2 while the size of the heap is greater than 1.
  • 94.  Working of Heap Sort 1. Since the tree satisfies the Max-Heap property, the largest item is stored at the root node. 2. Swap: Remove the root element and put it at the end of the array (nth position); put the last item of the tree (heap) in the vacant place. 3. Remove: Reduce the size of the heap by 1. 4. Heapify: Heapify the root element again so that we have the highest element at the root. 5. The process is repeated until all the items of the list are sorted.
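The steps above can be sketched in Python, using 0-based indexing with children at 2*i + 1 and 2*i + 2 as noted on the Heap Sort slide:

```python
def heapify(a, n, i):
    """Sift a[i] down so the subtree rooted at i satisfies the max-heap property."""
    largest, left, right = i, 2 * i + 1, 2 * i + 2
    if left < n and a[left] > a[largest]:
        largest = left
    if right < n and a[right] > a[largest]:
        largest = right
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        heapify(a, n, largest)

def heap_sort(a):
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # step 1: build a max heap
        heapify(a, n, i)
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]       # step 2: swap root to the end
        heapify(a, end, 0)                # steps 3-4: shrink heap, re-heapify root
    return a

print(heap_sort([8, 2, 4, 9, 3, 6]))  # [2, 3, 4, 6, 8, 9]
```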