This document provides an overview of data structures and algorithms. It defines key concepts like data structures, abstract data types, algorithms, asymptotic analysis and different algorithm design methods. It discusses analyzing time and space complexity of algorithms and introduces common asymptotic notations like Big-O, Omega and Theta notations. It also provides examples of different algorithm design techniques like divide and conquer, dynamic programming, greedy algorithms, backtracking and branch and bound.
2. Prof. Sonu Gupta
Data Structures
Representation of data and the operation allowed on
that data
Way to store and organize data to facilitate access
and modifications
Method of representing logical relationships between
individual data elements related to the solution of a
given problem
Examples: matrix, tree
Choice of Data Structure
Its structure should be able to represent the
relationships between data elements
It should be simple enough to effectively do required
processing on the data
No single data structure works well for all purposes
Abstract Data Type
Collection of data and associated operations for
manipulating that data
The specification of an ADT tells what its operations
do, not how they are implemented
ADTs support abstraction, encapsulation, data hiding
Implementing an ADT involves choosing a data
structure
Core operations on ADT
Add an item
Remove an item
Find, retrieve or access an item
ADT vs Data Structures
[Diagram: the program requests operations (Add, Remove, Find, Display) through the ADT interface and receives results; this "wall" of ADT operations isolates the program from the underlying data structure.]
Algorithm
A well-defined, finite, step-by-step procedure,
independent of any programming language, to achieve a
required result. It has the following properties:
• Input
Zero or more values
• Output
At least one value
• Definiteness
Each instruction precise and unambiguous
• Finiteness
Should terminate after finite number of steps
• Feasibility
Should be feasible with the available resources
Examples of an Algorithm
Problem − Design an algorithm to add two numbers
and display the result.
step 1 − START
step 2 − declare three integers a, b & c
step 3 − define values of a & b
step 4 − add values of a & b
step 5 − store output of step 4 to c
step 6 − print c
step 7 − STOP
step 1 − START ADD
step 2 − get values of a & b
step 3 − c ← a + b
step 4 − display c
step 5 − STOP
Pseudocode
Pseudo-code is a description of an algorithm that is
more structured than usual prose but less formal than
a programming language.
Example: finding the maximum element of an array.
Algorithm arrayMax(A, n):
Input: An array A storing n integers.
Output: The maximum element in A.
1     largest ← A[0]
2     for i ← 1 to n − 1 do
2.1       if largest < A[i] then
2.1.1         largest ← A[i]
3     return largest
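The pseudocode above translates directly into runnable code; a minimal Python sketch (the function name is mine):

```python
def array_max(A):
    """Return the maximum element of A, following the arrayMax pseudocode."""
    largest = A[0]                 # step 1: assume the first element is largest
    for i in range(1, len(A)):     # step 2: scan the remaining elements
        if largest < A[i]:         # step 2.1: found a bigger element
            largest = A[i]         # step 2.1.1: remember it
    return largest                 # step 3

print(array_max([3, 7, 2, 9, 4]))  # 9
```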
Algorithms to Find Biggest of 3 Numbers
Algorithm 1
big = a
if(b > big)
big = b
if (c > big)
big = c
return big
Algorithm 2
if(a > b)
{ if(a > c)
return a
else
return c
}
else
{ if(b > c)
return b
else
return c
}
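Both slide algorithms, sketched as runnable Python for comparison (function names are mine):

```python
def biggest_v1(a, b, c):
    """Algorithm 1: keep a running maximum."""
    big = a
    if b > big:
        big = b
    if c > big:
        big = c
    return big

def biggest_v2(a, b, c):
    """Algorithm 2: nested comparisons."""
    if a > b:
        return a if a > c else c
    return b if b > c else c

print(biggest_v1(3, 9, 5), biggest_v2(3, 9, 5))  # 9 9
```

Note that both designs perform exactly two comparisons on every input, so here the two algorithms are equally efficient.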
Algorithmic Efficiency
More than one algorithm may exist for solving a
problem
One algorithm might be more efficient than the others
Space Complexity
Amount of memory a program needs to run to
completion.
Components of Space Complexity
• Instruction space
• Data space
Space needed by constants, variables, dynamically
allocated objects
• Environmental stack space
Information needed to resume execution of partially
completed functions. Ex. Return address, values of
local variables and formal parameters
Time Complexity
Amount of time a program needs to run to completion.
The actual running time varies from system to system,
so complexity is measured by counting operations or steps.
Two ways to calculate time complexity
Operation count: Identify one or more key operations &
determine number of times these are performed
(identify operations that contribute most to time
complexity)
Step Count: Determine total number of steps executed
by program
Example to Calculate Time Complexity
algo sum()
{ s=0 --------- 1
for i= 1 to n --------- n + 1
s=s+a[i] ---------- n
return s ---------- 1
}
Total number of steps: 2n + 3
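The count can be checked empirically; a small sketch (function name is mine; the array contents don't affect the count) tallying each executed step:

```python
def sum_steps(n):
    """Tally the steps the sum algorithm above executes for input size n."""
    steps = 1                      # s = 0
    for _ in range(n):
        steps += 1                 # loop test that succeeds (n times)
        steps += 1                 # s = s + a[i]
    steps += 1                     # final loop test that fails
    steps += 1                     # return s
    return steps

print(sum_steps(10))  # 23 == 2*10 + 3
```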
Example to Calculate Time Complexity
algo printProducts()
{ for i= 1 to n --------- n + 1
for j = 1 to n --------- n(n + 1)
cout<<i*j; ---------- n *n
}
Total number of steps: 2n² + 2n + 1
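The same empirical check for the nested loops (function name is mine):

```python
def nested_steps(n):
    """Tally the steps the nested-loop algorithm above executes."""
    steps = 0
    for _ in range(n):
        steps += 1                 # outer loop test (true n times)
        for _ in range(n):
            steps += 1             # inner loop test (true n times)
            steps += 1             # cout << i*j
        steps += 1                 # inner loop test that fails
    steps += 1                     # outer loop test that fails
    return steps

print(nested_steps(7))  # 113 == 2*7**2 + 2*7 + 1
```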
Growth of Function with Input Size
Rate of growth of the running
time: how a function grows with
its input size.
Ex. A program for input
size n takes 6n² + 100n +
300 time. With increasing n,
6n² becomes much larger
than 100n + 300.
Thus, it is important to analyze
the performance of an algorithm as
its input size increases.
n     n²     n² − n
1     1      0
2     4      2
3     9      6
4     16     12
5     25     20
6     36     30
7     49     42
8     64     56
9     81     72
10    100    90
11    121    110
12    144    132
13    169    156
14    196    182
15    225    210
16    256    240
17    289    272
18    324    306
19    361    342
20    400    380
As n increases, n² becomes
much, much larger than n
Worst, Average and Best Cases
Worst Case Analysis
Maximum time required for program execution
Calculate upper bound on running time of an algorithm
Ex. In Linear Search, element not present
Usually done
Average Case Analysis
Average time required for program execution
Sometimes done
Best Case Analysis
Minimum time required for program execution
Calculate lower bound on running time of an algorithm
Ex. In Linear Search, element present in first location
Never done
[Chart: running times from 1 ms to 5 ms for inputs A–G, marking the worst case (maximum time), the best case (minimum time), and the average case in between.]
Asymptotic Analysis
In Asymptotic Analysis, we evaluate the performance
of an algorithm in terms of input size
It captures how the time (or space) taken by
an algorithm increases with the input size.
Asymptotic notations are mostly used to represent
time complexity
O - Big Oh
Ω - Omega
Θ - Theta
O-Notation
For functions f(n) and g(n), we say that f(n) = O(g(n)) if
and only if there are positive constants c and n0 such
that f(n) ≤ c·g(n) for all n ≥ n0.
g(n) should be as small as possible.
Used for worst case analysis (upper bound)
O(g(n)) = { f(n): there exist positive constants c and n0
such that 0 <= f(n) <= cg(n) for all n >= n0}
Ω-Notation (Lower Bound)
For functions f(n) and g(n), we say that f(n) = Ω(g(n))
if and only if there exist positive constants c and n0
such that f(n) ≥ c·g(n) for all n ≥ n0.
g(n) should be as large as possible.
Used for best case analysis(Lower bound)
Ω (g(n)) = {f(n): there exist positive constants c and n0
such that 0 <= cg(n) <= f(n) for all n >= n0}.
Θ-Notation
For functions f(n) and g(n), we say that f(n) = Θ(g(n))
if there exist positive constants n0, c1 and c2 such that f(n)
always lies between c1·g(n) and c2·g(n) for n ≥ n0
g(n) is both upper and lower bound of f(n).
Used for average case analysis
Θ(g(n)) = {f(n): there exist positive constants c1, c2 and
n0 such that 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0}
f(n) = 4n + 200
f(n) is O(n) as
4n + 200 ≤ 5n for all n ≥ 200
(c = 5, n0 = 200)
f(n) = 10n² + 4n + 2
f(n) is O(n²) as
10n² + 4n + 2 ≤ 11n² for all n ≥ 5
(c = 11, n0 = 5)
n     4n + 200   5n
1     204        5
2     208        10
3     212        15
4     216        20
5     220        25
…     …          …
199   996        995
200   1000       1000
201   1004       1005
202   1008       1010
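The claimed constants can be spot-checked in code; a sketch (function name is mine; a finite check, not a proof):

```python
def holds_upper_bound(f, g, c, n0, upto=10_000):
    """Check f(n) <= c*g(n) for every n in [n0, upto]."""
    return all(f(n) <= c * g(n) for n in range(n0, upto + 1))

# 4n + 200 <= 5n for all n >= 200 (c = 5, n0 = 200)
print(holds_upper_bound(lambda n: 4 * n + 200, lambda n: n, c=5, n0=200))          # True
# 10n^2 + 4n + 2 <= 11n^2 for all n >= 5 (c = 11, n0 = 5)
print(holds_upper_bound(lambda n: 10 * n * n + 4 * n + 2, lambda n: n * n, c=11, n0=5))  # True
```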
Calculation of Time complexity
Drop lower-order terms and constant factors.
Remove the coefficient of the highest-order term.
Examples
7n − 3 is O(n)
10n³ + 100n² + 11n + 900 is O(n³)
Disadvantages of Asymptotic Analysis
Can be misleading when problem instances are small
Cannot distinguish two algorithms with the same tight
bound (Ex. 1000n² and 2n² are both O(n²))
We often make simplifying assumptions when
analyzing, which don't always hold
Algorithm Design Methods
Divide and Conquer
Back Tracking Method
Dynamic Programming
Greedy Method
Brute Force
Branch and Bound
Divide and Conquer
Based on dividing problem into sub-problems
Approach
• Divide problem into smaller sub-problems
Sub-problems must be of same type
Sub-problems do not need to overlap
• Solve each sub-problem recursively
• Combine solutions to solve original problem
Usually contains two or more recursive calls
Ex. Merge sort, Quick sort, Binary search
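Binary search, one of the examples named above, sketched as recursive divide and conquer in Python:

```python
def binary_search(a, key, lo=0, hi=None):
    """Search sorted list a for key; return its index or -1."""
    if hi is None:
        hi = len(a) - 1
    if lo > hi:                    # sub-problem is empty: key not present
        return -1
    mid = (lo + hi) // 2           # divide: split the range in half
    if a[mid] == key:
        return mid
    if key < a[mid]:
        return binary_search(a, key, lo, mid - 1)   # conquer the left half
    return binary_search(a, key, mid + 1, hi)       # conquer the right half

print(binary_search([2, 5, 8, 12, 16], 12))  # 3
```

Binary search is the degenerate case with a single recursive call per step; merge sort and quick sort make two.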
Dynamic Programming
Similar to Divide and Conquer, but sub-problems
must overlap
Based on remembering past results
Approach
• Divide problem into smaller sub-problems
Sub-problems must be of same type
Sub-problems must overlap
• Solve each sub-problem recursively
Can use stored solution
• Combine solutions to solve the original problem
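A standard illustration (memoized Fibonacci; my example, not the slide's): fib(n − 1) and fib(n − 2) share the overlapping sub-problems whose results dynamic programming remembers:

```python
from functools import lru_cache

@lru_cache(maxsize=None)            # remember past results
def fib(n):
    """nth Fibonacci number; each overlapping sub-problem is solved once."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)  # stored solutions are reused from the cache

print(fib(30))  # 832040
```

Without the cache this recursion takes exponential time; with it, each of the n sub-problems is computed only once.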
Back Tracking Method
Considers searching every possible combination in
order to solve an optimization problem
Approach
• Make any possible move
• If found solution, return it
• Else backtrack and select another move
• If no move remains, return failure
Ex. N Queen’s problem, Maze problem
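The N-Queens problem named above, as a minimal backtracking sketch (function names are mine):

```python
def n_queens(n):
    """Return one column-per-row placement of n queens, or None on failure."""
    cols = []                                  # cols[r] = column of queen in row r
    def safe(r, c):
        # no shared column, no shared diagonal with any earlier queen
        return all(c != cc and abs(c - cc) != r - rr
                   for rr, cc in enumerate(cols))
    def place(r):
        if r == n:
            return True                        # all rows filled: solution found
        for c in range(n):                     # make any possible move
            if safe(r, c):
                cols.append(c)
                if place(r + 1):
                    return True
                cols.pop()                     # backtrack and select another move
        return False                           # no move remains: failure
    return cols if place(0) else None

print(n_queens(4))  # [1, 3, 0, 2]
```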
Greedy Method
Based on trying best current (local) choice
Approach
• At each step of algorithm Choose best local solution
• Avoid backtracking
Example – Minimum Spanning Tree algorithms
(Kruskal, Prims), Dijkstra shortest path, Huffman code
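A compact illustration (coin change with US-style denominations; my example, not one of the slide's): each step takes the best local choice and never backtracks:

```python
def greedy_change(amount, coins=(25, 10, 5, 1)):
    """Make change by repeatedly taking the largest coin that fits."""
    used = []
    for coin in coins:                 # denominations, largest first
        while amount >= coin:          # best local (greedy) choice
            amount -= coin
            used.append(coin)
    return used

print(greedy_change(63))  # [25, 25, 10, 1, 1, 1]
```

Greedy choices happen to be optimal for this coin system but not for arbitrary denominations, which is why a greedy algorithm's correctness must be argued per problem.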
Brute Force
Based on trying all possible solutions
Approach
• Generate and evaluate possible solutions until
Satisfactory solution is found
Best solution is found (if can be determined)
All possible solutions found
Return best solution
Return failure if no satisfactory solution
Generally most expensive approach
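A small sketch (two-sum; my example): generate and evaluate every candidate pair until a satisfactory one is found, returning failure otherwise:

```python
def two_sum_brute(nums, target):
    """Return indices (i, j) with nums[i] + nums[j] == target, else None."""
    for i in range(len(nums)):             # generate every possible pair...
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return (i, j)              # satisfactory solution found
    return None                            # no satisfactory solution: failure

print(two_sum_brute([3, 8, 5, 2], 10))  # (1, 3)
```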
Branch and Bound
Based on limiting search using current solution
Approach
• Track best current solution found
• Eliminate partial solutions that cannot improve
upon the best current solution
• Reduces amount of backtracking
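A minimal sketch (largest subset sum under a capacity; my example): track the best solution found so far and prune any branch whose optimistic bound cannot beat it:

```python
def best_fit(weights, cap):
    """Largest subset sum not exceeding cap, with bound-based pruning."""
    weights = sorted(weights, reverse=True)
    remaining = [0] * (len(weights) + 1)
    for i in range(len(weights) - 1, -1, -1):
        remaining[i] = remaining[i + 1] + weights[i]  # optimistic bound helper
    best = 0
    def branch(i, cur):
        nonlocal best
        best = max(best, cur)                  # track best current solution
        if i == len(weights) or cur + remaining[i] <= best:
            return                             # bound: branch cannot improve best
        if cur + weights[i] <= cap:
            branch(i + 1, cur + weights[i])    # branch: include weights[i]
        branch(i + 1, cur)                     # branch: exclude weights[i]
    branch(0, 0)
    return best

print(best_fit([8, 6, 4, 3], 11))  # 11
```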
Heuristic
Based on trying to guide search for solution
Heuristic ⇒ “rule of thumb”
Approach
• Generate and evaluate possible solutions
Using “rule of thumb”
Stop if satisfactory solution is found
Can reduce complexity