1. Analysis of Algorithms
   Big-Oh
   Courtesy: Dale Roberts
2. Asymptotic Analysis
   • Ignoring constants in T(n)
   • Analyzing T(n) as n "gets large"
   Example: T(n) = 13n³ + 42n² + 2n log n + 4n
   As n grows larger, n³ is MUCH larger than n², n log n, and n, so it dominates T(n).
   The running time grows "roughly on the order of n³".
   Notationally, T(n) = O(n³): the big-oh (O) notation.
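A quick numerical illustration of this dominance (a minimal sketch; T below is just the example from this slide, with the log taken base 2, which only affects a constant factor):

```python
import math

def T(n):
    """Example running time from the slide: T(n) = 13n^3 + 42n^2 + 2n log n + 4n."""
    return 13 * n**3 + 42 * n**2 + 2 * n * math.log2(n) + 4 * n

# Show how quickly the cubic term comes to dominate the total.
for n in (10, 100, 1_000, 10_000):
    cubic = 13 * n**3
    print(f"n={n:>6}  T(n)={T(n):.3e}  13n^3 / T(n) = {cubic / T(n):.4f}")
```

By n = 10,000 the cubic term already accounts for more than 99.9% of T(n), which is the intuition behind dropping the lower-order terms.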
3. Three Major Notations
   • O(g(n)), Big-Oh of g of n: the asymptotic upper bound.
   • Ω(g(n)), Big-Omega of g of n: the asymptotic lower bound.
   • Θ(g(n)), Big-Theta of g of n: the asymptotic tight bound.
4. Big-Oh Defined
   The O symbol was introduced in 1927 to indicate the relative growth of two functions based on their asymptotic behavior; it is now used to classify functions and families of functions.
   T(n) = O(f(n)) if there are constants c and n0 such that T(n) ≤ c*f(n) when n ≥ n0.
   [Figure: T(n) and c*f(n) plotted against n; c*f(n) is an upper bound for T(n) for all n ≥ n0.]
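As a hedged sketch of what such witnesses look like in practice, the T(n) from slide 2 is O(n³) with, for example, c = 61 and n0 = 1 (61 = 13 + 42 + 2 + 4; these particular constants are my choice, not from the slides):

```python
import math

def T(n):
    """T(n) = 13n^3 + 42n^2 + 2n log2(n) + 4n, the example from slide 2."""
    return 13 * n**3 + 42 * n**2 + 2 * n * math.log2(n) + 4 * n

def f(n):
    return n**3

# Witnesses: c = 61, n0 = 1. Each term of T(n) is at most its coefficient times n^3
# for n >= 1, so T(n) <= (13 + 42 + 2 + 4) * n^3 = 61 * n^3.
c, n0 = 61, 1
assert all(T(n) <= c * f(n) for n in range(n0, 10_000))
print("T(n) <= 61 * f(n) holds for every n checked in [1, 10000)")
```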
5. Big-Oh
   Describes an upper bound for the running time of an algorithm.
   Upper bounds for Insertion Sort running times:
   • worst case: O(n²), T(n) = c1*n² + c2*n + c3
   • best case: O(n), T(n) = c1*n + c2
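For concreteness, here is a minimal insertion sort sketch (not part of the original slides) with a comparison counter, so the O(n) best case on sorted input and the O(n²) worst case on reverse-sorted input can be observed directly:

```python
def insertion_sort(a):
    """Sort list a in place and return the number of key comparisons performed."""
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements one slot to the right until key's position is found.
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 1_000
print("sorted input  :", insertion_sort(list(range(n))), "comparisons (about n, best case)")
print("reversed input:", insertion_sort(list(range(n, 0, -1))), "comparisons (about n^2/2, worst case)")
```

On sorted input the inner loop exits after one comparison per element, giving roughly n comparisons; on reversed input every element is shifted past all of its predecessors, giving roughly n²/2.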
6. Big-O Notation
   We say Insertion Sort's run time is O(n²).
   Properly, we should say the run time is in O(n²).
   Read O as "Big-Oh" (you'll also hear it called "order").
   In general, a function f(n) is O(g(n)) if there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
   e.g., if f(n) = 1000n and g(n) = n², take n0 = 1000 and c = 1; then f(n) ≤ 1·g(n) for all n ≥ n0, and we say that f(n) = O(g(n)).
   The O notation indicates "bounded above by a constant multiple of".
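A small sketch that finds this crossover point empirically (the functions 1000n and n² and the choice c = 1 are the ones from the slide):

```python
def f(n):
    return 1000 * n    # f(n) = 1000n

def g(n):
    return n * n       # g(n) = n^2

c = 1
# Smallest n at which f(n) <= c * g(n); this is the n0 from the slide.
n0 = next(n for n in range(1, 100_000) if f(n) <= c * g(n))
print("crossover at n0 =", n0)                              # 1000
assert all(f(n) <= c * g(n) for n in range(n0, 100_000))    # and it stays below from there on
```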
7. Big-Oh Properties
   • The fastest-growing function dominates a sum: O(f(n) + g(n)) is O(max{f(n), g(n)}).
   • The product of upper bounds is an upper bound for the product: if f is O(g) and h is O(r), then fh is O(gr).
   • "f is O(g)" is transitive: if f is O(g) and g is O(h), then f is O(h).
   • Hierarchy of functions: O(1), O(log n), O(n^(1/2)), O(n log n), O(n²), O(2ⁿ), O(n!)
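A minimal sketch tabulating the hierarchy in the last bullet for a few values of n (the sample values of n are arbitrary):

```python
import math

# Functions from the slide's hierarchy, slowest-growing first.
funcs = [
    ("1",       lambda n: 1),
    ("log n",   lambda n: math.log2(n)),
    ("n^(1/2)", lambda n: math.sqrt(n)),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: n**2),
    ("2^n",     lambda n: 2**n),
    ("n!",      lambda n: math.factorial(n)),
]

for n in (4, 16, 64):
    row = "  ".join(f"{name}={value(n):.3g}" for name, value in funcs)
    print(f"n={n:>2}: {row}")
```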
8. Some Big-Oh's Are Not Reasonable
   Polynomial-time algorithms:
   • An algorithm is said to be polynomial if it is O(nᶜ), c > 1.
   • Polynomial algorithms are said to be reasonable: they solve problems in reasonable times!
   • Coefficients, constants, and low-order terms are ignored, e.g. if f(n) = 2n² then f(n) = O(n²).
   Exponential-time algorithms:
   • An algorithm is said to be exponential if it is O(rⁿ), r > 1.
   • Exponential algorithms are said to be unreasonable.
9. Can We Justify Big-O Notation?
   Big-O notation is a huge simplification; can we justify it?
   • It only makes sense for large problem sizes.
   • For sufficiently large problem sizes, the highest-order term swamps all the rest!
   Consider R = x² + 3x + 5 as x varies:

       x          x²              3x       5    R
       0          0               0        5    5
       10         100             30       5    135
       100        10,000          300      5    10,305
       1,000      1,000,000       3,000    5    1,003,005
       10,000     100,000,000     30,000   5    100,030,005
       100,000    10,000,000,000  300,000  5    10,000,300,005
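The table is easy to regenerate (a small sketch that simply re-evaluates R at the slide's sample points):

```python
def R(x):
    return x**2 + 3 * x + 5

print(f"{'x':>8} {'x^2':>14} {'3x':>8} {'5':>3} {'R':>16}")
for x in (0, 10, 100, 1_000, 10_000, 100_000):
    print(f"{x:>8,} {x**2:>14,} {3 * x:>8,} {5:>3} {R(x):>16,}")
```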
10. Classifying Algorithms Based on Big-Oh
    • A function f(n) is said to be of at most logarithmic growth if f(n) = O(log n).
    • A function f(n) is said to be of at most quadratic growth if f(n) = O(n²).
    • A function f(n) is said to be of at most polynomial growth if f(n) = O(nᵏ), for some natural number k > 1.
    • A function f(n) is said to be of at most exponential growth if there is a constant c such that f(n) = O(cⁿ), and c > 1.
    • A function f(n) is said to be of at most factorial growth if f(n) = O(n!).
    • A function f(n) is said to have constant running time if the size of the input n has no effect on the running time of the algorithm (e.g., assignment of a value to a variable). The equation for this algorithm is f(n) = c.
    Other logarithmic classifications: f(n) = O(n log n), f(n) = O(log log n).
11. Rules for Calculating Big-Oh
    • Base of logs is ignored: log_a n = O(log_b n).
    • Powers inside logs are ignored: log(n²) = O(log n).
    • Bases and powers in exponents are NOT ignored: 3ⁿ is not O(2ⁿ), and a^(n²) is not O(aⁿ).
    • If T(x) is a polynomial of degree n, then T(x) = O(xⁿ).
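A quick numeric check of the first and third rules (a sketch; the particular bases 2, 10, and 3 are arbitrary choices):

```python
import math

# Rule 1: changing the log base only changes a constant factor,
# because log_a(n) = log_b(n) / log_b(a).
for n in (10, 1_000, 1_000_000):
    print(f"log10(n) / log2(n) at n={n:>9,}: {math.log(n, 10) / math.log(n, 2):.5f}")
    # The ratio stays constant (= log 2 / log 10 ≈ 0.30103).

# Rule 3: changing the base of an exponent is NOT a constant-factor change:
# (3/2)^n grows without bound, so 3^n is not O(2^n).
for n in (10, 50, 100):
    print(f"3^n / 2^n at n={n:>3}: {3**n / 2**n:.3e}")
```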
12. Big-Oh Examples
    2n³ + 3n² + n = 2n³ + 3n² + O(n)
                  = 2n³ + O(n² + n)
                  = 2n³ + O(n²)
                  = O(n³)
    which is also O(n⁴), since Big-Oh gives only an upper bound.
13. Big-Oh Examples (cont.)
    3. Suppose a program P is O(n³) and a program Q is O(3ⁿ), and that currently both can solve problems of size 50 in 1 hour. If the programs are run on another system that executes exactly 729 times as fast as the original system, what size problems will they be able to solve in one hour?
14. Big-Oh Examples (cont.)
    For program P, O(n³):
        n³ = 50³ · 729
        n  = ∛(50³) · ∛729
        n  = 50 · 9 = 450
    For program Q, O(3ⁿ):
        3ⁿ = 3⁵⁰ · 729
        n  = log₃(729 · 3⁵⁰) = log₃ 729 + log₃ 3⁵⁰
        n  = 6 + 50 = 56
    Improvement: the problem size increases 9-fold for the n³ algorithm, but only slightly (+6) for the exponential algorithm.
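These figures are easy to verify numerically (a small sketch; 729 = 3⁶ = 9³ is what makes both answers come out exactly):

```python
import math

speedup = 729             # the new machine runs exactly 729 times as fast
old_size = 50             # both programs currently handle size 50 in one hour

# O(n^3) program: n^3 = 50^3 * 729  =>  n = 50 * 729^(1/3) = 50 * 9
cubic_new = old_size * round(speedup ** (1 / 3))
print("O(n^3) program:", cubic_new)            # 450

# O(3^n) program: 3^n = 3^50 * 729  =>  n = 50 + log3(729) = 50 + 6
exp_new = old_size + round(math.log(speedup, 3))
print("O(3^n) program:", exp_new)              # 56
```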
15. Acknowledgements
    Philadelphia University, Jordan
    Nilagupta, Pradondet
    Courtesy: Dale Roberts
