Fundamentals of the Analysis
of Algorithm Efficiency
• Algorithm analysis framework
• Asymptotic notations
• Analysis of non-recursive algorithms
• Analysis of recursive algorithms
Copyright © Li Zimao @ 2007-2008-1 SCUEC
Analysis of Algorithms
Analysis of algorithms means to investigate an
algorithm’s efficiency with respect to resources:
running time and memory space.
Time efficiency: how fast an algorithm runs.
Space efficiency: the space an algorithm requires.
Space and Time complexity
Components that affect space complexity:
• Program space
• Data space
• Stack space
Components that affect time complexity:
• Speed of the computer
• Choice of programming language
• Compiler used
• Choice of algorithm
• Number of inputs/outputs
• Size of inputs
Note: since we have no control over the first three (computer, language, compiler), we concentrate on the last three (algorithm, inputs/outputs, input size).
Analysis Framework
Measuring Input Sizes
• Typically, an algorithm runs longer as the size of its input increases.
• We are interested in knowing how efficiency scales with input size.
• Time efficiency of an algorithm depends on the input size n and hence is expressed as a function of n.
• Input size depends on the problem.
  – Example 1: what is the input size of the problem of sorting n numbers?
  – Example 2: what is the input size of adding two n-by-n matrices?
Units for Measuring Running Time
Should we measure the running time in standard units of time, such as seconds or minutes?
• No: that would depend on the speed of the particular computer.
Given the input size of an algorithm, the time efficiency is normally computed by considering the basic operation:
count the number of times the algorithm's basic operation is executed.
• Basic operation: the operation that contributes the most to the total running time.
• For example, the basic operation is usually the most time-consuming operation in the algorithm's innermost loop.
Ex1:
Ex2:
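The examples on the original Ex1/Ex2 slides are not reproduced here; a minimal C sketch in the same spirit, showing how the basic operation of a simple loop is identified and counted:

/* Sums the elements of a[0..n-1].
   Basic operation: the addition inside the loop. */
double array_sum(const double a[], int n) {
    double s = 0.0;
    for (int i = 0; i < n; i++)
        s += a[i];   /* basic operation, executed exactly n times */
    return s;
}

Here C(n) = n, so the running time is roughly proportional to n regardless of the values stored in the array.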
Input size and basic operation
examples
• Searching for a key in a list of n items — input size measure: number of the list's items, i.e. n; basic operation: key comparison.
• Multiplication of two matrices — input size measure: matrix dimensions or total number of elements; basic operation: multiplication of two numbers.
• Checking primality of a given integer n — input size measure: the size of n, i.e. the number of digits in its binary representation; basic operation: division.
• Typical graph problem — input size measure: number of vertices and/or edges; basic operation: visiting a vertex or traversing an edge.
Theoretical Analysis of Time Efficiency
Time efficiency is analyzed by determining the number of repetitions of the basic operation as a function of input size.
The efficiency analysis framework ignores the multiplicative constants of C(n), the basic-operation count, and focuses on the order of growth of C(n).
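In symbols (the standard formula of this framework, with c_op denoting the execution time of one basic operation):
T(n) ≈ c_op · C(n),
so multiplicative constants such as c_op do not affect the order of growth of the running time.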
Order of growth
Most important: order of growth within a constant multiple as n → ∞.
Some algorithms work faster for small input sizes n and become slower as n increases. How an algorithm's behavior changes as the value of n increases is called its order of growth.
Example:
• How much faster will the algorithm run on a computer that is twice as fast?
• How much longer does it take to solve a problem of double the input size?
(For a quadratic algorithm with cost cn², a twice-as-fast computer halves the running time, while doubling the input size makes it take about four times longer.)
Orders of Growth
Orders of growth:
• consider only the leading term of a formula
• ignore the constant coefficient.
Exponential-growth functions
[Figure: growth of log(n), n, n·log(n), n², and n³ for n = 1..8]
Worst-Case, Best-Case, and Average-Case Efficiency
Algorithm efficiency depends on the input size n.
For some algorithms, efficiency also depends on the type of input.
• Example: Sequential Search
– Problem: given a list of n elements and a search key K, find an element equal to K, if any.
– Algorithm: scan the list and compare its successive elements with K until either a matching element is found (successful search) or the list is exhausted (unsuccessful search).
For sequential search with input size n, what kind of input would make the running time the longest?
How many key comparisons would that take?
Sequential Search Algorithm
ALGORITHM SequentialSearch(A[0..n-1], K)
//Searches for a given value in a given array by sequential search
//Input: An array A[0..n-1] and a search key K
//Output: The index of the first element of A that matches K, or -1 if there are no matching elements
i ← 0
while i < n and A[i] ≠ K do
    i ← i + 1
if i < n  //A[i] = K
    return i
else
    return -1
Worst-case efficiency (the most time)
• Efficiency (# of times the basic operation is executed) for the worst-case input of size n.
• The algorithm runs the longest among all possible inputs of size n.
Best-case efficiency (the least time)
• Efficiency (# of times the basic operation is executed) for the best-case input of size n.
• The algorithm runs the fastest among all possible inputs of size n.
Average-case efficiency
• Efficiency (# of times the basic operation is executed) for a typical/random input of size n.
• NOT the average of the worst and best cases.
• How to find the average-case efficiency? (See the sequential-search counts below.)
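For sequential search, counting key comparisons gives the standard results (p below denotes the probability of a successful search, assumed equally likely at each of the n positions):
C_worst(n) = n,  C_best(n) = 1,
C_avg(n) = p(n + 1)/2 + n(1 − p),
which equals (n + 1)/2 when the search is always successful (p = 1) and n when it is always unsuccessful (p = 0).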
Summary of the Analysis Framework
Both time and space efficiencies are measured as functions of input size.
Time efficiency is measured by counting the number of basic operations
executed in the algorithm. The space efficiency is measured by the
number of extra memory units consumed.
The framework's primary interest lies in the order of growth of the algorithm's running time (and space) as its input size goes to infinity.
The efficiencies of some algorithms may differ significantly for inputs of the same size. For these algorithms, we need to distinguish between the worst-case, best-case, and average-case efficiencies.
Asymptotic Growth Rate
These notations are a mathematical way of representing time complexity.
Three notations are used to compare orders of growth of an algorithm's basic operation count:
• O(g(n)): class of functions f(n) that grow no faster than g(n)
• Ω(g(n)): class of functions f(n) that grow at least as fast as g(n)
• Θ(g(n)): class of functions f(n) that grow at the same rate as g(n)
O-notation
O-notation is used to estimate worst-case time complexity, i.e. an upper bound (the most time taken by an algorithm).
Formal definition
• A function t(n) is said to be in O(g(n)), denoted t(n) ∈ O(g(n)), if t(n) is bounded above by some constant multiple of g(n) for all large n, i.e., if there exist some positive constant c and some nonnegative integer n0 such that
t(n) ≤ c·g(n) for all n ≥ n0
Exercises: prove the following using the above definition
• 10n² ∈ O(n²)
• 10n² + 2n ∈ O(n²)
• 100n + 5 ∈ O(n²)
• 5n + 20 ∈ O(n)
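As a sample proof of the second exercise (one possible choice of constants): 10n² + 2n ≤ 10n² + 2n² = 12n² for all n ≥ 1, so the definition is satisfied with c = 12 and n0 = 1, and therefore 10n² + 2n ∈ O(n²).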
O-notation
Upper bound
Ω-notation
Ω-notation is used to estimate best-case time complexity, i.e. a lower bound (the least time taken by an algorithm).
Formal definition
• A function t(n) is said to be in Ω(g(n)), denoted t(n) ∈ Ω(g(n)), if t(n) is bounded below by some constant multiple of g(n) for all large n, i.e., if there exist some positive constant c and some nonnegative integer n0 such that
t(n) ≥ c·g(n) for all n ≥ n0
• It indicates that g(n) is a lower bound: the running time of the algorithm is at least c·g(n) for all sufficiently large n.
Exercises: prove the following using the above definition
• 10n² ∈ Ω(n²)
• 10n² + 2n ∈ Ω(n²)
• 10n³ ∈ Ω(n²)
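As a sample proof of the third exercise: 10n³ ≥ 10n² ≥ 1·n² for all n ≥ 1, so the definition is satisfied with c = 1 and n0 = 1, and therefore 10n³ ∈ Ω(n²).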
Ω-notation
Lower bound
Θ-notation
• Θ-notation is used to estimate a tight bound on the time complexity: the running time is bounded both above and below by constant multiples of g(n), so it lies between an upper bound and a lower bound of the same order.
Formal definition
• A function t(n) is said to be in Θ(g(n)), denoted t(n) ∈ Θ(g(n)), if t(n) is bounded both above and below by some positive constant multiples of g(n) for all large n, i.e., if there exist some positive constants c1 and c2 and some nonnegative integer n0 such that
c2·g(n) ≤ t(n) ≤ c1·g(n) for all n ≥ n0
• The upper bound indicates that t(n) will not consume more than c1·g(n), and the lower bound that it will consume at least c2·g(n), for all n ≥ n0.
Exercises: prove the following using the above definition
• 10n² ∈ Θ(n²)
• 10n² + 2n ∈ Θ(n²)
• (1/2)n(n − 1) ∈ Θ(n²)
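As a sample proof of the third exercise: for the upper bound, (1/2)n(n − 1) = (1/2)n² − (1/2)n ≤ (1/2)n² for all n ≥ 0 (take c1 = 1/2); for the lower bound, n ≤ n²/2 whenever n ≥ 2, so (1/2)n² − (1/2)n ≥ (1/2)n² − (1/4)n² = (1/4)n² (take c2 = 1/4). Hence (1/2)n(n − 1) ∈ Θ(n²) with n0 = 2.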
• Ω(g(n)): functions that grow at least as fast as g(n) (≥ g(n) in order of growth)
• Θ(g(n)): functions that grow at the same rate as g(n) (= g(n) in order of growth)
• O(g(n)): functions that grow no faster than g(n) (≤ g(n) in order of growth)
Θ-notation
Upper bound
Lower bound
Some Properties of Asymptotic
Order of Growth
1. f(n) ∈ O(f(n))
2. f(n) ∈ O(g(n)) iff g(n) ∈ Ω(f(n))
3. If f(n) ∈ O(g(n)) and g(n) ∈ O(h(n)), then f(n) ∈ O(h(n))
   Note the similarity with a ≤ b
4. If f1(n) ∈ O(g1(n)) and f2(n) ∈ O(g2(n)), then f1(n) + f2(n) ∈ O(max{g1(n), g2(n)})
The analogous assertions are true for the Ω-notation and Θ-notation.
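A one-line justification of property 4: if f1(n) ≤ c1·g1(n) and f2(n) ≤ c2·g2(n) for all sufficiently large n, then f1(n) + f2(n) ≤ c1·g1(n) + c2·g2(n) ≤ (c1 + c2)·max{g1(n), g2(n)}, so the constant c = c1 + c2 works in the definition of O.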
Some Properties of Asymptotic
Order of Growth
If an algorithm consists of two executable parts, the analysis of the whole algorithm can be obtained as follows:
if f1(n) ∈ O(g1(n)) and f2(n) ∈ O(g2(n)), then f1(n) + f2(n) ∈ O(max{g1(n), g2(n)}).
Implication: the algorithm's overall efficiency is determined by the part with the larger order of growth.
• For example: 5n² + 3n·log n ∈ O(n²)
Using Limits for Comparing Orders of Growth
We can also determine the order of growth (instead of using the asymptotic-notation definitions) by computing the limit of the ratio of the two functions:
limn→∞ T(n)/g(n) =
• 0: the order of growth of T(n) is less than the order of growth of g(n)
• c > 0: the order of growth of T(n) equals the order of growth of g(n)
• ∞: the order of growth of T(n) is greater than the order of growth of g(n)
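Two quick worked instances: limn→∞ 10n/(2n²) = limn→∞ 5/n = 0, so 10n grows more slowly than 2n²; and limn→∞ [n(n+1)/2]/n² = 1/2 > 0, so n(n+1)/2 has the same order of growth as n², i.e. n(n+1)/2 ∈ Θ(n²).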
L'Hôpital's rule
If limn→∞ f(n) = limn→∞ g(n) = ∞ and the derivatives f′, g′ exist, then
limn→∞ f(n)/g(n) = limn→∞ f′(n)/g′(n)
Examples:
• log2 n vs. n
• 2^n vs. n!
Stirling's formula: n! ≈ (2πn)^(1/2) · (n/e)^n
More examples:
• 10n vs. 2n²
• n(n+1)/2 vs. n²
• logb n vs. logc n
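A worked instance of the rule: limn→∞ (log2 n)/n = limn→∞ (1/(n·ln 2))/1 = 0, so log2 n grows more slowly than n. For the last example no limit rule is needed: logb n / logc n = ln c / ln b, a positive constant, so all logarithmic functions share the same order of growth.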
Try
Compare the orders of growth of log2 n and √n.
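One way to answer, using L'Hôpital's rule: limn→∞ (log2 n)/√n = limn→∞ (1/(n·ln 2)) / (1/(2√n)) = limn→∞ 2/(√n·ln 2) = 0, so log2 n grows more slowly than √n.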
Orders of growth of some
important functions
All logarithmic functions loga n belong to the same class Θ(log n), no matter what the logarithm's base a > 1 is.
All polynomials of the same degree k belong to the same class: ak·n^k + ak−1·n^(k−1) + … + a0 ∈ Θ(n^k).
Exponential functions a^n have different orders of growth for different values of a.
order of log n < order of n^ε (ε > 0) < order of a^n < order of n! < order of n^n
Basic Efficiency Classes
1 — constant (fastest, highest time efficiency)
log n — logarithmic
n — linear
n log n — "n-log-n"
n² — quadratic
n³ — cubic
2^n — exponential
n! — factorial (slowest, lowest time efficiency)
The time efficiencies of a large number of algorithms fall into only a few classes.
Summary of How to Establish Orders of Growth
of an Algorithm’s Basic Operation Count
Method 1: Using limits.
• L'Hôpital's rule
Method 2: Using the properties of asymptotic order of growth.
Method 3: Using the definitions of O-, Ω-, and Θ-notation.
Mathematical Analysis of Time Efficiency of
Nonrecursive Algorithms
Steps for the mathematical analysis of nonrecursive algorithms:
• Decide on a parameter n indicating input size.
• Identify the algorithm's basic operation.
• Check whether the number of times the basic operation is executed depends only on the input size n. If it also depends on the type of input and not only on its size, investigate the worst-, average-, and best-case efficiencies separately.
• Set up a summation for C(n) reflecting the number of times the algorithm's basic operation is executed.
• Simplify the summation using standard formulas.
Example 1: Maximum element
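The algorithm on the original slide is not reproduced here; a minimal C sketch of the usual textbook MaxElement algorithm and its analysis:

/* Returns the largest element of a[0..n-1] (n >= 1).
   Basic operation: the comparison a[i] > max. */
int max_element(const int a[], int n) {
    int max = a[0];
    for (int i = 1; i < n; i++)   /* loop body executes n - 1 times */
        if (a[i] > max)           /* basic operation */
            max = a[i];
    return max;
}

The comparison is executed exactly once per iteration regardless of the input, so C(n) = n − 1 ∈ Θ(n).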
Example 2: Element uniqueness
problem
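Again a minimal C sketch of the usual textbook UniqueElements algorithm, assumed to match the original slide:

/* Returns 1 if all elements of a[0..n-1] are distinct, 0 otherwise.
   Basic operation: the comparison a[i] == a[j]. */
int unique_elements(const int a[], int n) {
    for (int i = 0; i < n - 1; i++)
        for (int j = i + 1; j < n; j++)
            if (a[i] == a[j])     /* basic operation */
                return 0;
    return 1;
}

In the worst case (no two elements are equal, or only the last pair compared is equal), the comparison is made for every pair: C_worst(n) = (n − 1) + (n − 2) + … + 1 = n(n − 1)/2 ∈ Θ(n²).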
DIY: Matrix multiplication
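As a starting point for this do-it-yourself exercise, a minimal C sketch of the definition-based algorithm for square n×n matrices (row-major storage is an assumption of this sketch):

/* C = A * B for n-by-n matrices. Basic operation: the multiplication in the innermost loop. */
void matrix_multiply(int n, const double *A, const double *B, double *C) {
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++) {
            double sum = 0.0;
            for (int k = 0; k < n; k++)           /* n multiplications per (i, j) pair */
                sum += A[i*n + k] * B[k*n + j];   /* basic operation */
            C[i*n + j] = sum;
        }
}

The multiplication count does not depend on the matrix contents: C(n) = n · n · n = n³ ∈ Θ(n³).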
Mathematical Analysis of
Recursive Algorithms
Recursive evaluation of n!
Recursive solution to the Towers of Hanoi puzzle
Recursive solution to the number of binary digits
problem
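For the first of these, a minimal C sketch of recursive n! and the recurrence it produces (assumed to match the usual textbook version):

/* Computes n! recursively. Basic operation: the multiplication n * factorial(n - 1). */
long factorial(int n) {
    if (n == 0)
        return 1;                  /* M(0) = 0 multiplications */
    return n * factorial(n - 1);   /* M(n) = M(n - 1) + 1 */
}

Backward substitution gives M(n) = M(n − 1) + 1 = M(n − 2) + 2 = … = M(0) + n = n, so the recursive evaluation of n! performs n multiplications, i.e. M(n) ∈ Θ(n).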
Steps in Mathematical Analysis of
Recursive Algorithms
Decide on parameter n indicating input size
Identify algorithm’s basic operation
Check whether the number of times the basic operation is executed may vary on different inputs of the same size. (If it may, the worst, average, and best cases must be investigated separately.)
Set up a recurrence relation and initial condition(s) for C(n), the number of times the basic operation is executed for an input of size n (alternatively, count the recursive calls).
Solve the recurrence, or estimate the order of growth of its solution, by backward substitution or another method (see Appendix B).
/* Counts how many calls are made while repeatedly halving n (n >= 1). */
int count = 1;

int fn(int n) {
    if (n == 1)
        return 1;
    count++;            /* one more recursive call */
    return fn(n / 2);
}
/* After fn(n) returns, count holds the total number of calls, roughly log2(n) + 1. */
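The snippet above illustrates the last two steps: its call count satisfies the recurrence C(n) = C(n/2) + 1 with C(1) = 1. Solving by backward substitution for n = 2^k: C(2^k) = C(2^(k−1)) + 1 = C(2^(k−2)) + 2 = … = C(1) + k = k + 1, so C(n) = log2 n + 1 ∈ Θ(log n).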
Fibonacci numbers
The Fibonacci numbers:
0, 1, 1, 2, 3, 5, 8, 13, 21, …
The Fibonacci recurrence:
F(n) = F(n-1) + F(n-2)
F(0) = 0
F(1) = 1
General 2nd
order linear homogeneous recurrence with
constant coefficients:
aX(n) + bX(n-1) + cX(n-2) = 0
Application to the Fibonacci
numbers
F(n) = F(n-1) + F(n-2), or F(n) − F(n−1) − F(n−2) = 0
Characteristic equation: r² − r − 1 = 0
Roots of the characteristic equation: r1,2 = (1 ± √5)/2
General solution to the recurrence: F(n) = α·((1 + √5)/2)^n + β·((1 − √5)/2)^n
Particular solution for F(0) = 0, F(1) = 1: α = 1/√5, β = −1/√5
Golden Ratio
F(n) = (1/√5)·((1 + √5)/2)^n − (1/√5)·((1 − √5)/2)^n = (1/√5)·(φ^n − φ̂^n),
where φ = (1 + √5)/2 ≈ 1.61803 is known as the golden ratio and φ̂ = (1 − √5)/2.
Equivalently, F(n) = φ^n/√5 rounded to the nearest integer.
Thank you