Recurrence Relations and Big O
How do we compute the costs of running our code?
Recursive Functions (Math Ones)
• Similar to coding, a recursive function in math is something that calls
itself.
• We have one or more base cases, and a recursive case.
• Example: T(1) = 1, T(2) = 1, T(n) = T(n-1) + T(n-2) for n >= 3
• This is a pretty comprehensive topic. However, we want to focus on Software
Engineering, so we will look at recurrences with respect to understanding
asymptotic complexities.
Building it (and using it)
• We can take the code: any base case in the code translates directly to a
base case in our math formulation.
• Our recursive case translates directly as well.
• We substitute values until we hit a base case.
• Then we add up the number of operations.
• OR we draw a tree and add up the work at each node.
• They are the same process, so it depends on what you prefer (see the worked
substitution after this list).
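For instance, substituting the example recurrence down to its base cases for
n = 5:

T(5) = T(4) + T(3)
     = (T(3) + T(2)) + (T(2) + T(1))
     = ((T(2) + T(1)) + T(2)) + (T(2) + T(1))
     = ((1 + 1) + 1) + (1 + 1) = 5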
Example 1: T(1) = 1, T(2) = 1, T(n) = T(n-1) + T(n-2)
for n >= 3. Calculating complexity
• T(n) = T(n-1) + T(n-2)
• = T(n-2) + T(n-3) + T(n-3) + T(n-4)
• = …
• We have repeatedly substituted each term with the recurrence one level
lower. Eventually we reach the base cases.
• We can see a pattern form. Each higher-level function call splits into 2
lower-level ones (until we hit the base cases): T(n) -> T(n-1) + T(n-2);
T(n-1) -> T(n-2) + T(n-3); … T(n-(n-3)) = T(3) -> T(2) + T(1), where we hit
the base cases.
Example 1: Sketching our tree
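A text sketch of the call tree for n = 5 (each node splits into two children
until we reach the base cases T(2) and T(1)):

                T(5)
              /      \
          T(4)        T(3)
         /    \      /    \
      T(3)   T(2)  T(2)  T(1)
     /    \
   T(2)  T(1)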
Example 1: Code
function F1(n){
  if(n == 1 || n == 2){
    return 1
  }else{
    return F1(n-1) + F1(n-2)
  }
}
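A quick sanity check (the values match the substitution worked out earlier):

console.log(F1(1)) // 1 (base case)
console.log(F1(2)) // 1 (base case)
console.log(F1(5)) // 5 (T(5) = T(4) + T(3) = 3 + 2)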
Calculating the Time Complexity (COST)
• We can see that the function calls itself twice per recursive call (our
sketch of this pattern has 2 branches from every level).
• So the number of calls for T(n) is at most 2 times the number of calls for
T(n-1) (an upper bound, since the T(n-2) branch makes fewer calls than the
T(n-1) branch). NOTE: We are talking about the number of calls, not the cost.
• Each recursive call does 1 operation (the addition); base cases just do
the return.
• So how many calls do we have? (See the instrumented count below.)
• What is the cost? Can you think of a generalized formula?
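One way to answer "how many calls" is to count them empirically. This is a
minimal sketch: F1Counted and the calls counter are names introduced here for
illustration, but the recursion mirrors F1 above.

// Instrumented copy of F1 that counts how many times it gets called.
let calls = 0

function F1Counted(n){
  calls = calls + 1                 // count this call
  if(n == 1 || n == 2){
    return 1                        // base case: just the return
  }else{
    return F1Counted(n-1) + F1Counted(n-2)   // two recursive calls
  }
}

for(const n of [5, 10, 20]){
  calls = 0
  F1Counted(n)
  console.log("n=" + n + ": " + calls + " calls")
}
// Prints: n=5: 9 calls, n=10: 109 calls, n=20: 13529 calls.
// The count grows roughly like 2^n, matching the 2-branch pattern above.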
Big O analysis
• Definition: f(n) = O(g(n)) if there exists a positive integer n0 and a
positive constant c, such that f(n) ≤ c*g(n) ∀ n ≥ n0.
• It states the following: imagine we have 2 functions f and g. f is in the
Big O of g if we can find an n0 and c such that f(n) ≤ c*g(n) ∀ n ≥ n0.
• Essentially, we can say that f is in the Big O of g if we can show that
c*g(n) will be at least f(n) for all values of n at or above n0.
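A worked example (the functions and constants here are picked for
illustration): take f(n) = 3n + 5 and g(n) = n. Choosing c = 4 and n0 = 5
gives 3n + 5 ≤ 4n for all n ≥ 5, so f(n) = O(n).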
Simple way to calculate Costs
• Number of Function Calls * Operations per Call (general outline for
calculations).
• This is where understanding Recurrence Relationships comes into play.
• RRs can help you strip the code down to its essentials, which is helpful
for figuring out the number of function calls (see the example after this
list).
• Think of the problems we’ve already done. What do you think of their
costs? How would you split their number of function calls and operations
per call?
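Applying this outline to Example 1: the call tree has O(2^n) calls in the
worst case, and each call does O(1) work (one addition or one return), so
the total cost is O(2^n) * O(1) = O(2^n). (As a side note, the tight bound
is actually Θ(φ^n) with φ ≈ 1.618, but O(2^n) is what the 2-branch pattern
gives directly.)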
Your Interesting Question
• https://www.hackerrank.com/challenges/jumping-on-the-clouds/problem
• Try to break down this problem into cases and solve it.
• Let me know if you need help with something.
