<Day1_Session1>
Analysis of Algorithms
(Time and Space Complexity)
Why Analysis of Algorithms?
One problem, multiple solutions (algorithms).
A single problem can have multiple solutions, depending on which data structure we choose, what kind of input we receive, and much more.
How do we decide which algorithm to choose?
Big scenario: a complex problem (made up of smaller problems), where each smaller problem has more than one solution.
How do we decide which algorithm is best for a particular problem?
Could we simply run the different algorithms and see which one is better?
● The result depends on the machine: Intel Dual Core vs. Intel i5 11th gen
● It is time consuming
Now we can all agree:
Algorithm analysis helps us determine which algorithm is most efficient in terms of the time and space consumed (not in exact units, but precisely and symbolically).
How to Compare Algorithms?
Execution times: not a good measure, since they depend on the machine (Core 2 Duo vs. i5).
Number of statements executed: not a good measure either, since the number of statements varies with the programming language as well as the style of the individual programmer.
Ideal solution: express the running time of a given algorithm as a function of the input size n (i.e., f(n)) and compare these functions. This kind of comparison is independent of machine speed, programming style, etc.; i.e., we analyse the rate of growth of the function.
Similarly, we can express the auxiliary space used in terms of a function of n.
Rate of Growth of a Function
The rate at which the running time increases as a function of the input size is called the rate of growth.
Example:
Total cost = cost of Lamborghini + cost of Nano
Total cost ≈ cost of Lamborghini
Similarly, in the case below, n^4, 2n^2, 100n and 500 are the individual costs of some function, and we approximate the sum by n^4, since n^4 has the highest rate of growth:
f(n) = n^4 + 2n^2 + 100n + 500 = O(n^4)
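The dominant-term idea can be checked numerically. The Python sketch below (an illustration of mine, not code from the session) shows the ratio f(n) / n^4 approaching 1 as n grows, because the lower-order terms become negligible:

```python
# Dominant-term demonstration: for large n, f(n) is governed by n^4,
# so the ratio f(n) / n^4 approaches 1 as n grows.

def f(n):
    return n**4 + 2 * n**2 + 100 * n + 500

for n in (10, 100, 1000):
    print(n, f(n) / n**4)
```

At n = 10 the lower-order terms still contribute noticeably, but by n = 1000 they are lost in rounding, which is exactly why asymptotic analysis keeps only the fastest-growing term.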
Big-O Notation (Upper Bounding Function)
f(n) = O(g(n))
means that, for large enough values of n, g(n) is an upper bound on f(n).
O(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0}.
g(n) is an asymptotic upper bound for f(n).
Generally we discard small values of n; the rate of growth at small values of n is not important. n0 is the point from which we need to consider the rate of growth for a given algorithm. Below n0, the rate of growth could be different. n0 is called the threshold for the given function.
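The definition can be checked directly on a concrete function. In the Python sketch below (the witnesses c = 4 and n0 = 8 are choices of mine, not values from the notes), we confirm that f(n) = 3n + 8 satisfies the Big-O condition with g(n) = n:

```python
# Witnessing f(n) = 3n + 8 = O(n): exhibit constants c and n0 with
# 0 <= f(n) <= c * n for all n >= n0.

def f(n):
    return 3 * n + 8

c, n0 = 4, 8  # candidate witnesses: 3n + 8 <= 4n holds whenever n >= 8
ok = all(0 <= f(n) <= c * n for n in range(n0, 10_000))
print(ok)
```

Any larger c (with a suitable n0) would also work; Big-O only requires that some pair of constants exists.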
Questions
1. f(n) = 3n + 8
2. f(n) = n^2 + 1
3. f(n) = n^4 + 100n^2 + 50
4. f(n) = n^2 = O(n^4)
Binary Search
Binary Search vs. Ternary Search: a key note
log2 n vs. log3 n?
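For reference, a standard iterative binary search in Python (a textbook implementation, not code taken from the session):

```python
def binary_search(arr, target):
    """Return the index of target in the sorted list arr, or -1 if absent.

    Each iteration halves the search interval, so at most about
    log2(n) iterations are needed: O(log n) time, O(1) auxiliary space.
    """
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1   # target can only be in the right half
        else:
            hi = mid - 1   # target can only be in the left half
    return -1
```

On the log2 n vs. log3 n question: ternary search does have fewer levels (log3 n < log2 n), but each level costs up to two comparisons instead of one, for roughly 2·log3 n ≈ 1.26·log2 n comparisons in total, which is why binary search is usually preferred.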
Best Case, Average Case, Worst Case
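These three cases are easiest to see on linear search. The sketch below (an illustrative example of mine, not from the session) counts comparisons so the cases become visible:

```python
def linear_search(arr, target):
    """Return (index of target or -1, number of comparisons made)."""
    comparisons = 0
    for i, x in enumerate(arr):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

data = [4, 8, 15, 16, 23, 42]

# Best case: target is the first element -> 1 comparison, Omega(1).
print(linear_search(data, 4))    # (0, 1)
# Worst case: target absent -> n comparisons, O(n).
print(linear_search(data, 99))   # (-1, 6)
# Average case: roughly n/2 comparisons for a uniformly random present target.
```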
Omega-Ω Notation (Lower Bounding Function)
This notation gives a lower bound of the given algorithm, and we represent it as f(n) = Ω(g(n)). That means, for large enough values of n, g(n) is a lower bound on f(n).
For example, if f(n) = 100n^2 + 10n + 50, then f(n) = Ω(n^2).
Ω(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0}.
g(n) is an asymptotic lower bound for f(n).
Questions
1. f(n) = n^2 + n
2. f(n) = n^3 + log n
3. f(n) = n^3 != Ω(log n)
4. f(n) = n^2 = Ω(n^2) = O(n^2)
Theta-Θ Notation (Order Function)
The Theta notation asymptotically bounds a function from above and below.
Θ(g(n)) = {f(n): there exist positive constants c1, c2 and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0}.
g(n) is an asymptotically tight bound for f(n).
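The two-sided definition can also be checked on a concrete function. In the sketch below (the witnesses c1 = 1, c2 = 2, n0 = 1 are my own choices) we confirm f(n) = n^2 + n = Θ(n^2):

```python
# Witnessing f(n) = n^2 + n = Theta(n^2): exhibit c1, c2, n0 with
# c1 * g(n) <= f(n) <= c2 * g(n) for all n >= n0.

def f(n):
    return n**2 + n

def g(n):
    return n**2

c1, c2, n0 = 1, 2, 1   # n^2 <= n^2 + n <= 2*n^2 holds once n >= 1
ok = all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print(ok)
```

The lower-bound half is the Ω witness and the upper-bound half is the O witness; Θ is exactly their conjunction.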
Questions
Now solve this: if we have the recurrence T(n) = aT(n/b) + Θ(n^k · log^p n),
where a ≥ 1, b > 1, k ≥ 0 and p is a real number, then (by the advanced master theorem):
1. If a > b^k: T(n) = Θ(n^(log_b a))
2. If a = b^k:
   a) If p > -1: T(n) = Θ(n^k · log^(p+1) n)
   b) If p = -1: T(n) = Θ(n^k · log log n)
   c) If p < -1: T(n) = Θ(n^k)
3. If a < b^k:
   a) If p ≥ 0: T(n) = Θ(n^k · log^p n)
   b) If p < 0: T(n) = O(n^k)
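As a numerical sanity check of such recurrences, the sketch below (the example recurrence and base case are my own choices) evaluates T(n) = 2T(n/2) + n, i.e. a = 2, b = 2, k = 1, p = 0. Since a = b^k and p > -1, the predicted growth is T(n) = Θ(n · log n):

```python
import math
from functools import lru_cache

# Recurrence T(n) = 2*T(n/2) + n with base case T(1) = 1
# (the merge sort recurrence). Here a = 2, b = 2, k = 1, p = 0,
# so a == b^k and p > -1: predicted growth Theta(n * log n).

@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# For powers of two, T(2^j) = 2^j * (j + 1) exactly, so the ratio
# T(n) / (n * log2(n)) tends to 1 as n grows.
for n in (2**10, 2**15, 2**20):
    print(n, T(n) / (n * math.log2(n)))
```

This is how merge sort's Θ(n log n) bound falls out of case 2a of the theorem.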
More Questions:
1. f(n) = 100n + 100, g(n) = n^2:
a) f(n) = O(g(n))
b) f(n) = Ω(g(n))
c) g(n) = Ω(f(n))
d) g(n) = O(f(n))
2. f(n) = n^2, g(n) = 2^n, h(n) = n^3:
a) f(n) = O(g(n))
b) h(n) = O(g(n))
c) h(n) = O(h(n))
</Day1_Session1>

Weekends with Competitive Programming
