Adnan Shahzada
• As mentioned in lecture 1, growth rates of algorithms are important:
  • how fast do the resources needed by an algorithm grow with respect to the size of the problem?
• What we are primarily interested in is the asymptotic growth rate:
  • how fast the resources grow when the problem size is extremely large
  • more specifically, the ratio of computational resources for problems of size n and n+1 as n → ∞
• We are interested in asymptotic growth rates for two reasons:
  • For practical purposes, large problems are where we expect to have big computational requirements
  • For theoretical purposes, concentrating on growth rates frees us from some distracting issues:
    • fixed costs (e.g. switching the computer on!), which may dominate for a small problem size but are largely irrelevant for a large one
    • machine and implementation details
• The growth rate is a compact and easy-to-understand function
• These notations are used to describe the asymptotic running time of an algorithm.
• Asymptotic Time Complexity: the limiting behavior of the execution time of an algorithm as the size of the problem goes to infinity.
• Asymptotic Space Complexity: the limiting behavior of the memory use of an algorithm as the size of the problem goes to infinity.
• Consider two functions
  • t(n) ≥ 0
  • f(n) ≥ 0
• We say that t(n) is in the order of f(n) if:
  • t(n) is bounded above by f(n) for all sufficiently large n
  • i.e. t(n) ≤ c·f(n) for all n ≥ n0
    • where c is a positive constant
    • n0 is an integer threshold
• If t(n) is in the order of f(n) we write this as:
  • t(n) is in O(f(n))
• Often you will see things written like:
  • t(n) = O(f(n))
• Strictly speaking this use of the equals sign is incorrect
  • the relationship is a set membership, not an equality
  • t(n) ∈ O(f(n)) is better
• Example 1: 2n + 10 is O(n)
  • need 2n + 10 ≤ cn
  • (c − 2)n ≥ 10
  • n ≥ 10/(c − 2)
  • Pick c = 3 and n0 = 10
• Example 2: 3n³ + 20n² + 5 is O(n³)
  • need c > 0 and n0 ≥ 1 such that 3n³ + 20n² + 5 ≤ c·n³ for all n ≥ n0
  • this is true for c = 4 and n0 = 21
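The two witnesses above can be sanity-checked numerically; a minimal Python sketch (a finite check over a sampled range, not a proof — the helper name `bounded_above` is mine, not from the slides):

```python
# Numeric sanity check of the Big-O witnesses (c, n0) claimed above.
# Verifies t(n) <= c*f(n) for every n in [n0, upto]; a finite check,
# not a proof, but a good way to catch a wrong constant.

def bounded_above(t, f, c, n0, upto=10_000):
    """Return True if t(n) <= c*f(n) for all n in [n0, upto]."""
    return all(t(n) <= c * f(n) for n in range(n0, upto + 1))

# Example 1: 2n + 10 <= 3n for n >= 10
print(bounded_above(lambda n: 2*n + 10, lambda n: n, c=3, n0=10))   # True

# Example 2: 3n^3 + 20n^2 + 5 <= 4n^3 for n >= 21
print(bounded_above(lambda n: 3*n**3 + 20*n**2 + 5,
                    lambda n: n**3, c=4, n0=21))                     # True
```

Note that the check fails at n = 20 for Example 2 (8005 > 8000), which is why the threshold must be n0 = 21.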
• Rules:
  1. Drop lower-order terms
  2. Drop constant factors
• Use the smallest possible class of functions
  • Say "2n is O(n)" instead of "2n is O(n²)"
• Use the simplest expression of the class
  • Say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)"
• The maximum rule states that
  • O(f(n) + g(n)) = O( max(f(n), g(n)) )
• The term with the largest order dominates
• Proof:
  • f(n) + g(n) = max(f(n), g(n)) + min(f(n), g(n))
  • f(n) + g(n) ≤ 2·max(f(n), g(n))
• The constant factor of 2 can then be absorbed into c
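The two lines of the proof can be illustrated numerically; f and g below are arbitrary sample functions chosen for illustration:

```python
# Illustrating the maximum rule's proof steps on sample functions:
#   f(n) + g(n) = max(f, g) + min(f, g)  <=  2 * max(f, g)

f = lambda n: n * n          # quadratic term
g = lambda n: 10 * n         # linear term

for n in range(1, 1000):
    fn, gn = f(n), g(n)
    assert fn + gn == max(fn, gn) + min(fn, gn)   # exact identity
    assert fn + gn <= 2 * max(fn, gn)             # the upper bound
print("maximum rule inequality holds on all sampled n")
```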
• Seven functions that often appear in algorithm analysis:
  • Constant: 1
  • Logarithmic: log n
  • Linear: n
  • N-Log-N: n log n
  • Quadratic: n²
  • Cubic: n³
  • Exponential: 2ⁿ
• Examples??
• c (constant)
• log n
• n
• n log n
• nᵏ (obviously n² is lower order than n³, and so on)
• kⁿ
• n!
• nⁿ
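The ordering of this hierarchy can be made concrete by tabulating a few of the functions for growing n (k = 2 is chosen here for illustration):

```python
# Tabulate some of the growth functions from the hierarchy for a few
# sample sizes, to make the ordering concrete.
import math

funcs = [
    ("log n",   lambda n: math.log2(n)),
    ("n",       lambda n: n),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: n ** 2),
    ("2^n",     lambda n: 2 ** n),
]

for n in (10, 20, 30):
    row = "  ".join(f"{name}={fn(n):.0f}" for name, fn in funcs)
    print(f"n={n}: {row}")
```

Already at n = 30 the exponential term (2³⁰ ≈ 10⁹) dwarfs the quadratic one (900), which is why the hierarchy matters so much in practice.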
• Big-O represents an upper bound on algorithmic cost
• Big-Ω represents a lower bound on algorithmic cost
• Def: t(n) ∈ Ω(f(n)) ⟺ t(n) ≥ c·f(n) for all n ≥ n0
• 3n + 2 = Ω(n), with c = 2 and n0 = 2
• 3n³ + 3n − 1 is Ω(n³), with c = 2 and n0 = 2
• Note the duality rule: t(n) ∈ Ω(f(n)) ⟺ f(n) ∈ O(t(n))
Θ-Notation
• Asymptotic Tight Bound
• A function f(n) is Θ(g(n)) if there exist positive constants c1, c2, and n0 such that
  • 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0
• When a problem is Θ(n), this represents both an upper and a lower bound
  • i.e. it is O(n) and Ω(n)
  • we say there is no algorithmic gap
• Is 3n + 2 = Θ(n) ??
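The question above can be explored numerically; the constants c1 = 3, c2 = 4, n0 = 2 below are my choice of witnesses, not given on the slide:

```python
# Exploring whether 3n + 2 is Theta(n): check the two-sided bound
# c1*n <= 3n + 2 <= c2*n over a sampled range of n >= n0.
# Witness constants chosen here for illustration.

c1, c2, n0 = 3, 4, 2
ok = all(c1 * n <= 3 * n + 2 <= c2 * n for n in range(n0, 10_000))
print(ok)  # True
```

The lower bound 3n ≤ 3n + 2 holds for all n; the upper bound 3n + 2 ≤ 4n rearranges to n ≥ 2, which is where n0 = 2 comes from.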
• f(n) = o(g(n)) means f(n) becomes insignificant relative to g(n) as n approaches infinity.
• f(n) = o(g(n)) means for all c > 0 there exists some n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n > n0
• Is 7n = o(n²) ?
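The "becomes insignificant" intuition can be seen by watching the ratio f(n)/g(n) shrink; a small sketch using the slide's own question, 7n vs n²:

```python
# Little-o intuition: if f(n) = o(g(n)) then f(n)/g(n) -> 0 as n grows.
# Here f(n) = 7n and g(n) = n^2, so the ratio is 7/n.

f = lambda n: 7 * n
g = lambda n: n * n

ratios = [f(n) / g(n) for n in (10, 100, 1000, 10_000)]
print(ratios)  # each ratio is 7/n, shrinking toward 0
assert all(a > b for a, b in zip(ratios, ratios[1:]))
```

Since the ratio tends to 0, for any c > 0 we eventually have 7n < c·n², so 7n = o(n²) indeed holds.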
• f(n) = ω(g(n)) means that for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0
• Compare with little-o: f(n) = o(g(n)) means for all c > 0 there exists some n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n > n0
• Is 7n² = ω(n) ?
• If two algorithms have the same asymptotic complexity, say O(n²), will the execution times of the two algorithms always be the same?
• How do we select between two algorithms having the same asymptotic performance?
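One way to see why the answer to the first question is "no": two toy functions below (both hypothetical, written for this illustration) do the same job in Θ(n) time, yet their basic-operation counts differ by a constant factor that Big-O deliberately ignores:

```python
# Two ways to sum a list. Both are Theta(n), but their per-element
# constant factors differ, so their operation counts (a stand-in for
# running time) differ by a roughly constant ratio.

def sum_simple(xs):
    ops = 0
    total = 0
    for x in xs:
        total += x
        ops += 1            # one basic operation per element
    return total, ops

def sum_with_extra_work(xs):
    ops = 0
    total = 0
    for x in xs:
        total += x
        _ = x * x           # redundant extra work each iteration
        _ = x + 1
        ops += 3            # three basic operations per element
    return total, ops

data = list(range(1000))
_, ops_a = sum_simple(data)
_, ops_b = sum_with_extra_work(data)
print(ops_a, ops_b, ops_b / ops_a)  # 1000 3000 3.0
```

When asymptotic classes tie, constant factors, memory behavior, and implementation simplicity become the deciding criteria.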
• A common misconception is that worst-case running time is somehow defined by big-Oh, and that best case is defined by big-Omega.
• There is no formal relationship like this.
• However, worst case and big-Oh are commonly used together, because they are both techniques for finding an upper bound on running time.