Chapter 2
Introduction to
Algorithms
Dr. Muhammad Hanif Durad
Department of Computer and Information Sciences
Pakistan Institute Engineering and Applied Sciences
hanif@pieas.edu.pk
Some slides have been adapted, with thanks, from other lectures
available on the Internet. It made my life easier, as life is always
miserable at PIEAS (Sir Muhammad Yusaf Kakakhil)
Dr. Hanif Durad 2
Lecture Outline
 Algorithm
 Analysis of Algorithms
 Computational Model
 Random Access Machine (RAM)
 Average, Worst, and Best Cases
 Higher order functions of n are normally considered less efficient
 Asymptotic Notation
 Θ, O, Ω, o, ω
 Why Does Growth Rate Matter?
Algorithm (1/2)
 Informally,
 A tool for solving a well-specified computational
problem.
 Example: sorting
input: A sequence of numbers.
output: An ordered permutation of the input.
[Diagram: Input → Algorithm → Output]
D:DSALCOMP 550-00101-algo.ppt
Dr. Hanif Durad
Algorithm (2/2)
 What is an algorithm?
 A clearly specified set of simple instructions to be
followed to solve a problem
 Takes a set of values as input and
 produces a value, or a set of values, as output
 Usually specified as pseudo-code
 Data structures
 Methods of organizing data
 Program = algorithms + data structures
4
D:DSALCOMP171 Data Structures and Algorithmintro_algo.ppt
5
Analysis of Algorithms (1/2)
 Correctness:
 Does the algorithm do what is intended?
 Efficiency:
 What is the running time of the algorithm?
 How much storage does it consume?
 Different algorithms may be correct
 Which one should I use?
 Analysis of algorithms uses mathematical
techniques to predict the efficiency of algorithms.
Dr. Hanif Durad
D:Data StructuresHanif_Searchch1intro.ppt+
D:DSALCD3570lecture1_introduction.pdf
CD3570
6
Analysis of Algorithms (2/2)
 What do we mean by efficiency?
 Efficiency is usually given with respect to some cost measure
 Cost measures are defined in terms of resource usage:
 Execution time
 Memory usage
 Communication bandwidth
 Computer hardware
 Energy consumption
 etc.
 We will mainly look at cost in terms of execution time
Dr. Hanif Durad
D:Data StructuresHanif_Searchch1intro.ppt+
D:DSALCD3570lecture1_introduction.pdf
Running-time of algorithms
 Bounds are for the algorithms, rather than
programs
 programs are just implementations of an algorithm, and
almost always the details of the program do not affect
the bounds
 Bounds are for algorithms, rather than problems
 A problem can be solved with several algorithms, some
are more efficient than others
D:Data StructuresCOMP171 Data Structures and Algorithmalgo.ppt
But, how to measure the time?
 Multiplication and addition: which one takes longer?
 How do we measure >=, assignment, &&, ||, etc.?
Machine dependent?
What is the efficiency of an
algorithm?
Run time in the computer: Machine Dependent
Example: Need to multiply two positive integers a and b
Subroutine 1: Multiply a and b
Subroutine 2: V = a, W = b
While W > 1
V V + a; W W-1
Output V
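A minimal C sketch of the two subroutines above (the function names multiply_direct and multiply_by_addition are mine, not from the slide; both assume positive integers, as stated):

#include <stdio.h>

/* Subroutine 1: a single multiplication, i.e. one basic operation. */
long multiply_direct(long a, long b) {
    return a * b;
}

/* Subroutine 2: repeated addition; the loop body runs b - 1 times. */
long multiply_by_addition(long a, long b) {
    long V = a, W = b;
    while (W > 1) {
        V = V + a;   /* one addition per iteration */
        W = W - 1;   /* one subtraction per iteration */
    }
    return V;
}

int main(void) {
    printf("%ld %ld\n", multiply_direct(6, 7), multiply_by_addition(6, 7));  /* prints 42 42 */
    return 0;
}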
Solution: Machine Independent
Analysis
We assume that every basic operation takes constant time:
Example Basic Operations:
Addition, Subtraction, Multiplication, Memory Access
Non-basic Operations:
Sorting, Searching
Efficiency of an algorithm is the number of basic
operations it performs
We do not distinguish between the basic operations.
Subroutine 1 uses 1 basic operation (a single multiplication)
Subroutine 2 uses on the order of b basic operations (an addition and a
subtraction on each of its b − 1 loop passes)
Subroutine 1 is more efficient.
This measure is good for all large input sizes
In fact, we will not worry about the exact values, but will look at "broad
classes" of values, or the growth rates
Let there be n inputs.
If an algorithm needs n basic operations and another needs 2n basic
operations, we will consider them to be in the same efficiency
category.
However, we distinguish between exp(n), n, log(n)
Computational Model
 Should be simple, or even simplistic.
 Assign uniform cost for all simple operations and
memory accesses. (Not true in practice.)
 Question: Is this OK?
 Should be widely applicable.
 Can’t assume the model to support complex
operations. Ex: No SORT instruction.
 Size of a word of data is finite.
 Why? Dr. Hanif Durad 12
D:DSALCOMP 550-00101-algo.ppt
Random Access Machine (RAM)
 Generic single-processor model.
 Supports simple constant-time instructions found in real
computers.
 Arithmetic (+, –, *, /, %, floor, ceiling).
 Data Movement (load, store, copy).
 Control (branch, subroutine call).
 Run time (cost) is uniform (1 time unit) for all simple
instructions.
 Memory is unlimited.
 Flat memory model – no hierarchy.
 Access to a word of memory takes 1 time unit.
 Sequential execution – no concurrent operations. 13
D:DSALCOMP 550-00101-algo.ppt
14
Complexity
 Complexity is the number of steps required to solve a problem.
 The goal is to find the best algorithm to solve the problem in
as few steps as possible
 Complexity of Algorithms
 The size of the problem is a measure of the quantity of the input data n
 The time needed by an algorithm, expressed as a function of the size
of the problem it solves, is called the (time) complexity of the
algorithm, T(n)
D:DSALAlgorithms and computational complexity
03_Growth_of_Functions_1.ppt, P-3
Dr. Hanif Durad
15
Basic idea: counting operations
 Running Time: Number of primitive steps that are executed
 most statements roughly require the same amount of time
 y = m * x + b
 c = 5 / 9 * (t - 32 )
 z = f(x) + g(y)
 Each algorithm performs a sequence of basic operations:
 Arithmetic: (low + high)/2
 Comparison: if ( x > 0 ) …
 Assignment: temp = x
 Branching: while ( true ) { … }
 …
Dr. Hanif Durad
16
Basic idea: counting operations
 Idea: count the number of basic operations
performed on the input.
 Difficulties:
 Which operations are basic?
 Not all operations take the same amount of time.
 Operations take different times with different
hardware or compilers
Dr. Hanif Durad
17
Measures of Algorithm
Complexity
 Let T(n) denote the number of operations required by an
algorithm to solve a given class of problems
 Often T(n) depends on the input; in such cases one can talk
about
 Worst-case complexity,
 Best-case complexity,
 Average-case complexity of an algorithm
 Alternatively, one can determine bounds (upper or lower)
on T(n)
Dr. Hanif Durad
18
Measures of Algorithm
Complexity
 Worst-Case Running Time: the longest time over all inputs
of size n
 provides an upper bound on running time for any input
 Best-Case Running Time: the shortest time over all inputs
of size n
 provides a lower bound on running time for any input
 Average-Case Behavior: the expected performance
averaged over all possible inputs
 it is generally better than the worst-case behavior, but sometimes it's
roughly as bad as the worst case
 difficult to compute
Dr. Hanif Durad
Average, Worst, and Best Cases
 An algorithm may run faster on certain data sets
than others.
 Finding the average case can be very difficult,
so typically algorithms are measured in the
worst case time complexity.
 Also, in certain application domains (e.g., air
traffic control, medical, etc.) knowing the worst
case time complexity is of crucial importance.
Dr. Hanif Durad 19
D:Data StructuresICS202Lecture05.ppt
Worst vs. Average Case
Dr. Hanif Durad 20
D:Data StructuresICS202Lecture05.ppt
21
Example 1: Sum Series
Algorithm (computes the sum ∑_{i=1}^{N} i³)     Step Count
Line 1                                          1
Line 2                                          2N + 2
Line 3                                          4N
Line 4                                          1
Total                                           6N + 4
 Lines 1 and 4 count for one unit each
 Line 3: executed N times, each time four units
 Line 2: (1 for initialization, N+1 for all the tests, N for all the
increments) total 2N + 2
 total cost: 6N + 4 ⇒ O(N)
D:Data StructuresCOMP171 Data Structures and Algorithmalgo.ppt
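The algorithm's code lines did not survive the slide export; below is a minimal C sketch consistent with the step counts above (the name sum_cubes and the exact loop form are assumptions, reconstructed from the counts 1, 2N + 2, 4N, 1 and the sum ∑ i³):

/* Hypothetical reconstruction; e.g. sum_cubes(3) returns 1 + 8 + 27 = 36. */
int sum_cubes(int N) {
    int partialSum;
    partialSum = 0;                  /* line 1: 1 unit                                         */
    for (int i = 1; i <= N; i++)     /* line 2: 1 init + (N + 1) tests + N increments = 2N + 2 */
        partialSum += i * i * i;     /* line 3: 4 units (2 multiplications, 1 addition,        */
                                     /*         1 assignment), executed N times = 4N           */
    return partialSum;               /* line 4: 1 unit                                         */
}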
22
Example 2: Sequential Search
Algorithm Step Count
// Searches for x in array A of n items
// returns index of found item, or n+1 if not found
Seq_Search( A[n]: array, x: item){            // 0
done = false                                  // 1
i = 1                                         // 1
while ((i <= n) and (A[i] <> x)){             // n + 1
i = i + 1                                     // n
}                                             // 0
return i                                      // 1
}                                             // 0
Total: 2n + 4
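A direct C rendering of the pseudocode (a sketch: C arrays are 0-based, so the slide's 1-based index i is mapped to A[i - 1], and the unused done flag is omitted):

#include <stdio.h>

/* Returns the 1-based index of x in A (of n items), or n + 1 if not found. */
int seq_search(const int A[], int n, int x) {
    int i = 1;
    while (i <= n && A[i - 1] != x)   /* at most n + 1 loop tests   */
        i = i + 1;                    /* at most n increments       */
    return i;
}

int main(void) {
    int A[] = {4, 8, 15, 16, 23, 42};
    printf("%d\n", seq_search(A, 6, 23));  /* prints 5: found at position 5  */
    printf("%d\n", seq_search(A, 6, 99));  /* prints 7 = n + 1: not found    */
    return 0;
}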
23
Example: Sequential Search
 worst-case running time
 when x is not in the original array A
 in this case, while loop needs 2(n + 1) comparisons + c other
operations
 So, T(n) = 2n + 2 + c ⇒ linear complexity
 best-case running time
 when x is found in A[1]
 in this case, while loop needs 2 comparisons + c other operations
 So, T(n) = 2 + c ⇒ constant complexity
Dr. Hanif Durad
24
Order of Growth
 For very large input sizes, it is the rate of growth, or order of
growth, that matters asymptotically
 We can ignore the lower-order terms, since they are
relatively insignificant for very large n
 We can also ignore the leading term's constant coefficient,
since it is not as important for the rate of growth in
computational efficiency for very large n
 Higher order functions of n are normally considered less
efficient
Dr. Hanif Durad
25
Asymptotic Notation
 Θ, O, Ω, o, ω
 Used to describe the running times of algorithms
 Instead of exact running time, say Θ(n²)
 Defined for functions whose domain is the set of natural
numbers, N
 Determine sets of functions, in practice used to compare two
functions
Dr. Hanif Durad
26
Asymptotic Notation
 By now you should have an intuitive feel for
asymptotic (big-O) notation:
 What does O(n) running time mean? O(n²)?
O(n lg n)?
 Our first task is to define this notation more
formally and completely
Dr. Hanif Durad
27
Big-O notation
(Upper Bound – Worst Case)
 For a given function g(n), we denote by O(g(n)) the set of functions
 O(g(n)) = {f(n): there exist positive constants c > 0 and n₀ > 0 such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
 We say g(n) is an asymptotic upper bound for f(n):
 O(g(n)) means that as n → ∞, the execution time f(n) is at most c·g(n)
for some constant c
 What does O(g(n)) running time mean?
 The worst-case running time (upper bound) is a function of g(n) to
within a constant factor
 0 ≤ lim_{n→∞} f(n)/g(n) < ∞
Dr. Hanif Durad
28
Big-O notation
(Upper Bound – Worst Case)
[Figure: running time versus n; f(n) lies below c·g(n) for all n ≥ n₀, illustrating f(n) = O(g(n))]
Dr. Hanif Durad
29
O-notation
For a given function g(n), we
denote by O(g(n)) the set of
functions
O(g(n)) = {f(n): there exist
positive constants c and n0 such
that
0 ≤ f(n) ≤ c·g(n),
for all n ≥ n₀ }
We say g(n) is an asymptotic upper bound for f(n)
30
Big-O notation
(Upper Bound – Worst Case)
 This is a mathematically formal way of ignoring constant
factors, and looking only at the “shape” of the function
 f(n)=O(g(n)) should be considered as saying that “f(n) is at
most g(n), up to constant factors”.
 We usually will have f(n) be the running time of an
algorithm and g(n) a nicely written function
 e.g. The running time of insertion sort algorithm is O(n²)
 Example: 2n² = O(n³), with c = 1 and n₀ = 2.
31
Examples of functions in O(n²)
 n²
 n² + n
 n² + 1000n
 1000n² + 1000n
Also,
 n
 n/1000
 n^1.99999
 n²/ lg lg lg n
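As a quick sanity check of one entry in the list above, one possible choice of witnesses for 1000n² + 1000n = O(n²) is c = 2000 and n₀ = 1 (other choices work equally well):

\[
1000n^2 + 1000n \;\le\; 1000n^2 + 1000n^2 \;=\; 2000\,n^2
\quad\text{for all } n \ge 1,
\]
so $0 \le 1000n^2 + 1000n \le c\,n^2$ for all $n \ge n_0$ with $c = 2000$ and $n_0 = 1$.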
32
 Example1: Is 2n + 7 = O(n)?
 Let
 T(n) = 2n + 7
 T(n) = n (2 + 7/n)
 Note for n = 7:
 2 + 7/n = 2 + 7/7 = 3
 T(n) ≤ 3n, ∀ n ≥ 7 (so c = 3 and n₀ = 7)
 Then T(n) = O(n)
 lim_{n→∞} [T(n)/n] = 2, which is finite, so T(n) = O(n)
Big-O notation
(Upper Bound – Worst Case)
33
Big-O notation
(Upper Bound – Worst Case)
 Example 2: Is 5n³ + 2n² + n + 10⁶ = O(n³)?
 Let
 T(n) = 5n³ + 2n² + n + 10⁶
 T(n) = n³ (5 + 2/n + 1/n² + 10⁶/n³)
 Note for n = 100:
 5 + 2/n + 1/n² + 10⁶/n³ =
 5 + 2/100 + 1/10000 + 1 = 6.0201
 T(n) ≤ 6.05 n³, ∀ n ≥ 100 (so c = 6.05 and n₀ = 100)
 Then T(n) = O(n³)
 lim_{n→∞} [T(n)/n³] = 5, which is finite, so T(n) = O(n³)
34
Big-O notation
(Upper Bound – Worst Case)
 Express the execution time as a function of the input size n
 Since only the growth rate matters, we can ignore the multiplicative
constants and the lower order terms, e.g.,
 n, n+1, n+80, 40n, n+log n are all O(n)
 n^1.1 + 10000000000n is O(n^1.1)
 n² is O(n²)
 3n² + 6n + log n + 24.5 is O(n²)
 O(1) < O(log n) < O((log n)³) < O(n) < O(n²) < O(n³) < O(n^log n) <
O(2^sqrt(n)) < O(2ⁿ) < O(n!) < O(nⁿ)
 Constant < Logarithmic < Linear < Quadratic < Cubic < Polynomial <
Exponential < Factorial
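To see this ordering numerically, a small C sketch that prints a few of these functions side by side (the sample values n = 10, 20, 40 are an arbitrary choice for illustration):

#include <stdio.h>
#include <math.h>

int main(void) {
    const int ns[] = {10, 20, 40};
    printf("%6s %10s %10s %12s %16s\n", "n", "log2(n)", "n^2", "n^3", "2^n");
    for (int i = 0; i < 3; i++) {
        double n = ns[i];                       /* growth is visible even for small n */
        printf("%6.0f %10.2f %10.0f %12.0f %16.0f\n",
               n, log2(n), n * n, n * n * n, pow(2.0, n));
    }
    return 0;
}

Compile with, e.g., cc growth.c -lm. Doubling n from 20 to 40 roughly doubles n, quadruples n², but multiplies 2ⁿ by about a million.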
35
Ω-notation (Omega)
(Lower Bound – Best Case)
 For a given function g(n), we denote by Ω(g(n)) the set of functions
 Ω(g(n)) = {f(n): there exist positive constants c > 0 and n₀ > 0 such
that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }
 We say g(n) is an asymptotic lower bound for f(n):
 Ω(g(n)) means that as n → ∞, the execution time f(n) is at least
c·g(n) for some constant c
 What does Ω(g(n)) running time mean?
 The best-case running time (lower bound) is a function of g(n) to
within a constant factor
 lim_{n→∞} f(n)/g(n) > 0
36
Ω-notation
(Lower Bound – Best Case)
[Figure: running time versus n; f(n) lies above c·g(n) for all n ≥ n₀, illustrating f(n) = Ω(g(n))]
37
Ω-notation
For a given function g(n), we
denote by Ω(g(n)) the set of
functions
Ω(g(n)) = {f(n): there exist
positive constants c and n₀
such that
0 ≤ c·g(n) ≤ f(n)
for all n ≥ n₀ }
We say g(n) is an asymptotic lower bound for f(n)
38
Ω-notation (Omega)
(Lower Bound – Best Case)
 We say Insertion Sort's running time T(n) is Ω(n)
 For example
 the worst-case running time of insertion sort is O(n²),
and
 the best-case running time of insertion sort is Ω(n)
 Its running time therefore falls anywhere between a linear
function of n and a quadratic function of n
 Example: √n = Ω(lg n), with c = 1 and n₀ = 16.
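A brief check of the stated constants (a sketch, not a complete proof): the two sides meet exactly at n₀ = 16, and √n grows faster than lg n from that point on:

\[
\sqrt{16} = 4 = \lg 16,
\qquad
\frac{d}{dn}\left(\sqrt{n} - \lg n\right)
  = \frac{1}{2\sqrt{n}} - \frac{1}{n\ln 2} > 0
  \quad\text{for } n \ge 16,
\]
so $\sqrt{n} \ge 1 \cdot \lg n$ for all $n \ge 16$, i.e. $\sqrt{n} = \Omega(\lg n)$ with $c = 1$ and $n_0 = 16$.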
39
Examples of functions in Ω(n²)
 n²
 n² + n
 n² − n
 1000n² + 1000n
 1000n² − 1000n
Also,
 n³
 n^2.00001
 n² lg lg lg n
40
Θ notation (Theta)
(Tight Bound)
 In some cases,
 f(n) = O(g(n)) and f(n) = Ω(g(n))
 This means that the worst and best cases require the
same amount of time to within a constant factor
 In this case we use a new notation called "theta", written Θ
 For a given function g(n), we denote by Θ(g(n))
the set of functions
 Θ(g(n)) = {f(n): there exist positive constants c₁ > 0, c₂
> 0 and n₀ > 0 such that
 c₁·g(n) ≤ f(n) ≤ c₂·g(n) ∀ n ≥ n₀ }
41
Θ notation (Theta)
(Tight Bound)
 We say g(n) is an asymptotic tight bound for f(n):
 Theta notation
 Θ(g(n)) means that as n → ∞, the execution time f(n) is at most
c₂·g(n) and at least c₁·g(n) for some constants c₁ and c₂.
 f(n) = Θ(g(n)) if and only if
 f(n) = O(g(n)) & f(n) = Ω(g(n))
 0 < lim_{n→∞} f(n)/g(n) < ∞
42
Θ notation (Theta)
(Tight Bound)
[Figure: running time versus n; f(n) lies between c₁·g(n) and c₂·g(n) for all n ≥ n₀, illustrating f(n) = Θ(g(n))]
43
Θ notation (Theta)
(Tight Bound)
 Example:
n²/2 − 2n = Θ(n²), with c₁ = 1/4, c₂ = 1/2, and
n₀ = 8.
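The stated constants can be checked directly against the definition above:

\[
\frac{n^2}{2} - 2n \;\le\; \frac{1}{2}\,n^2 \quad\text{for all } n \ge 0,
\qquad
\frac{n^2}{2} - 2n \;\ge\; \frac{1}{4}\,n^2
\;\Longleftrightarrow\; \frac{n^2}{4} \ge 2n
\;\Longleftrightarrow\; n \ge 8,
\]
so $c_1 n^2 \le n^2/2 - 2n \le c_2 n^2$ holds for all $n \ge n_0$ with $c_1 = 1/4$, $c_2 = 1/2$, and $n_0 = 8$.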
44
o-notation
 For a given function g(n), we denote by o(g(n)) the set
of functions:
o(g(n)) = {f(n): for any positive constant c > 0, there
exists a constant n₀ > 0 such that 0 ≤ f(n) < c·g(n) for
all n ≥ n₀ }
 f(n) becomes insignificant relative to g(n) as n
approaches infinity: lim_{n→∞} [f(n)/g(n)] = 0
 We say g(n) is an upper bound for f(n) that is not
asymptotically tight.
45
O(*) versus o(*)
O(g(n)) = {f(n): there exist positive constants c and n₀ such that
0 ≤ f(n) ≤ c·g(n), for all n ≥ n₀ }.
o(g(n)) = {f(n): for any positive constant c > 0, there exists a
constant n₀ > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n₀ }.
Thus o(g(n)) is a strict form of O(g(n)): the bound c·g(n) must hold for
every positive constant c, not just for some c.
For example: n² = O(n²)
n² ≠ o(n²)
n² = O(n³)
n² = o(n³)
46
o-notation
 n^1.9999 = o(n²)
 n²/ lg n = o(n²)
 n² ≠ o(n²) (just as 2 is not < 2)
 n²/1000 ≠ o(n²)
47
ω-notation
 For a given function g(n), we denote by ω(g(n)) the
set of functions
ω(g(n)) = {f(n): for any positive constant c > 0, there
exists a constant n₀ > 0 such that 0 ≤ c·g(n) < f(n) for
all n ≥ n₀ }
 f(n) becomes arbitrarily large relative to g(n) as n
approaches infinity: lim_{n→∞} [f(n)/g(n)] = ∞
 We say g(n) is a lower bound for f(n) that is not
asymptotically tight.
48
ω-notation
 n^2.0001 = ω(n²)
 n² lg n = ω(n²)
 n² ≠ ω(n²)
49
Comparison of Functions
f  g  a  b
f (n) = O(g(n))  a  b
f (n) = W(g(n))  a  b
f (n) = Q(g(n))  a = b
f (n) = o(g(n))  a < b
f (n) = w (g(n))  a > b
Why Does Growth Rate Matter?
Complexity   n = 10         n = 20         n = 30
n            0.00001 sec    0.00002 sec    0.00003 sec
n²           0.0001 sec     0.0004 sec     0.0009 sec
n³           0.001 sec      0.008 sec      0.027 sec
n⁵           0.1 sec        3.2 sec        24.3 sec
2ⁿ           0.001 sec      1.0 sec        17.9 min
3ⁿ           0.059 sec      58 min         6.5 years
Why Does Growth Rate Matter?
Complexity   n = 40           n = 50              n = 60
n            0.00004 sec      0.00005 sec         0.00006 sec
n²           0.0016 sec       0.0025 sec          0.0036 sec
n³           0.064 sec        0.125 sec           0.216 sec
n⁵           1.7 min          5.2 min             13.0 min
2ⁿ           12.7 days        35.7 years          366 centuries
3ⁿ           3855 centuries   2 × 10⁸ centuries   1.3 × 10¹³ centuries
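These figures are consistent with a machine that performs about one basic operation per microsecond (10⁶ operations per second); that rate is an inference from the n row, not stated on the slides. A small C sketch that regenerates the raw times in seconds (conversion to minutes, days, years, and centuries is left out for brevity):

#include <stdio.h>
#include <math.h>

/* Assumption: 1e6 basic operations per second, inferred from the n row
   (10 operations -> 0.00001 sec). */
static double seconds(double ops) { return ops / 1.0e6; }

int main(void) {
    const int ns[] = {10, 20, 30, 40, 50, 60};
    for (int i = 0; i < 6; i++) {
        double n = ns[i];
        printf("n=%2.0f  n:%.5gs  n^2:%.5gs  n^3:%.5gs  n^5:%.5gs  2^n:%.3gs  3^n:%.3gs\n",
               n, seconds(n), seconds(n * n), seconds(pow(n, 3.0)),
               seconds(pow(n, 5.0)), seconds(pow(2.0, n)), seconds(pow(3.0, n)));
    }
    return 0;
}

Compile with, e.g., cc times.c -lm; reading off 2^n at n = 30 gives about 1074 seconds, i.e. the 17.9 minutes shown in the table.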