# Algorithm.ppt



1. Divide-and-Conquer
   - Divide the problem into a number of subproblems.
   - Conquer the subproblems by solving them recursively. If the subproblem sizes are small enough, solve the subproblems in a straightforward manner.
2. Divide-and-Conquer
   - Combine the solutions to the subproblems into the solution for the original problem.
3. Merge Sort Algorithm
   - Divide: Divide the n-element sequence into two subsequences of n/2 elements each.
   - Conquer: Sort the two subsequences recursively using merge sort.
   - Combine: Merge the two sorted subsequences.
4. How to Merge Two Sorted Sequences
   - We have two subarrays A[p..q] and A[q+1..r], each in sorted order.
   - The merge sort algorithm merges them to form a single sorted subarray that replaces the current subarray A[p..r].
5. Merging Algorithm
   Merge(A, p, q, r)
   1. n1 ← q − p + 1
   2. n2 ← r − q
   3. Create arrays L[1..n1+1] and R[1..n2+1]
6. Merging Algorithm
   4. for i ← 1 to n1
   5.     do L[i] ← A[p + i − 1]
   6. for j ← 1 to n2
   7.     do R[j] ← A[q + j]
7. Merging Algorithm
   8. L[n1 + 1] ← ∞
   9. R[n2 + 1] ← ∞
   10. i ← 1
   11. j ← 1
8. Merging Algorithm
   12. for k ← p to r
   13.     do if L[i] ≤ R[j]
   14.         then A[k] ← L[i]
   15.              i ← i + 1
   16.         else A[k] ← R[j]
   17.              j ← j + 1
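The merging pseudocode above can be sketched as a runnable Python function, with `float('inf')` standing in for the ∞ sentinels (a minimal sketch using 0-based indices rather than the slides' 1-based ones):

```python
def merge(A, p, q, r):
    """Merge sorted subarrays A[p..q] and A[q+1..r] in place.

    Indices p, q, r are 0-based here, unlike the 1-based pseudocode.
    """
    L = A[p:q + 1] + [float('inf')]       # left run plus sentinel
    R = A[q + 1:r + 1] + [float('inf')]   # right run plus sentinel
    i = j = 0
    for k in range(p, r + 1):             # fill A[p..r] left to right
        if L[i] <= R[j]:
            A[k] = L[i]
            i += 1
        else:
            A[k] = R[j]
            j += 1

A = [2, 4, 5, 7, 1, 2, 3, 6]   # two sorted runs: A[0..3] and A[4..7]
merge(A, 0, 3, 7)              # A becomes [1, 2, 2, 3, 4, 5, 6, 7]
```

The sentinels let the loop avoid checking whether either run is exhausted: once a run's real elements are consumed, its ∞ sentinel loses every comparison.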
9. Merging Algorithm
   [Slides 9–17: figure steps (a)–(i) tracing Merge on an 8-element subarray of A, with L = ⟨2, 4, 5, 7, ∞⟩ and R = ⟨1, 2, 3, 6, ∞⟩. At each step the smaller of L[i] and R[j] is copied into A[k] and the corresponding index advances, until the subarray holds ⟨1, 2, 2, 3, 4, 5, 6, 7⟩.]
18. Merge Sort Algorithm
   Merge-Sort(A, p, r)
   1. if p < r
   2.     then q ← ⌊(p + r)/2⌋
   3.          Merge-Sort(A, p, q)
   4.          Merge-Sort(A, q + 1, r)
   5.          Merge(A, p, q, r)
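The recursion above can be sketched as a self-contained Python function; the merge step is inlined so the sketch stands alone (names and the 0-based indexing are illustrative, not from the slides):

```python
def merge_sort(A, p=0, r=None):
    """Sort A[p..r] in place: divide at the midpoint, recursively sort
    each half, then merge the two sorted halves."""
    if r is None:
        r = len(A) - 1
    if p < r:
        q = (p + r) // 2            # divide: midpoint
        merge_sort(A, p, q)         # conquer: sort left half
        merge_sort(A, q + 1, r)     # conquer: sort right half
        # combine: merge the two sorted halves using inf sentinels
        L = A[p:q + 1] + [float('inf')]
        R = A[q + 1:r + 1] + [float('inf')]
        i = j = 0
        for k in range(p, r + 1):
            if L[i] <= R[j]:
                A[k] = L[i]
                i += 1
            else:
                A[k] = R[j]
                j += 1

A = [5, 2, 4, 7, 1, 3, 2, 6]
merge_sort(A)                       # A becomes [1, 2, 2, 3, 4, 5, 6, 7]
```

The recursion bottoms out when p ≥ r, i.e. when the subarray has at most one element and is trivially sorted.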
19. Operation of Merge Sort
   [Slide 19: recursion-tree figure showing an 8-element sequence split down to single elements and merged back up, level by level, into sorted order.]
20. Analyzing Divide-and-Conquer Algorithms
   When an algorithm contains a recursive call to itself, its running time can be described by a recurrence equation, or recurrence.
21. Recurrence
   If the problem size is small enough, say n ≤ c for some constant c, the straightforward solution takes constant time, which we write as θ(1).
22. Recurrence
   Suppose we have:
   - a subproblems, each of which is 1/b the size of the original
   - D(n) time to divide the problem
23. Recurrence
   - C(n) time to combine the solutions
   The recurrence is
   T(n) = θ(1)                    if n ≤ c
          aT(n/b) + D(n) + C(n)   otherwise
24. Recurrence
   Divide: The divide step just computes the middle of the subarray, which takes constant time, so D(n) = θ(1).
25. Recurrence
   Conquer: We recursively solve two subproblems, each of size n/2, which contributes 2T(n/2) to the running time.
26. Recurrence
   Combine: The Merge procedure takes θ(n) time on an n-element subarray, so C(n) = θ(n). The recurrence is
   T(n) = θ(1)             if n = 1
          2T(n/2) + θ(n)   if n > 1
27. Recurrence
   Let us rewrite the recurrence as
   T(n) = c              if n = 1
          2T(n/2) + cn   if n > 1
   where c represents the time required to solve a problem of size 1.
28. A Recursion Tree for the Recurrence
   [Slides 28–30: the recursion tree for T(n). The root costs cn with children T(n/2) and T(n/2); expanding, the next level has two nodes of cost cn/2, then four of cost cn/4, and so on down to n leaves of cost c. Each level's costs sum to cn, and the tree has lg n + 1 levels.]
31. Total Running Time
   The fully expanded tree has lg n + 1 levels, and each level contributes a total cost of cn. Therefore
   T(n) = cn lg n + cn = θ(n lg n)
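For powers of two, the closed form can be checked directly against the recurrence (a quick sanity check, not part of the slides):

```python
import math

def T(n, c=1):
    """Evaluate the recurrence T(1) = c, T(n) = 2*T(n/2) + c*n,
    for n a power of 2."""
    if n == 1:
        return c
    return 2 * T(n // 2, c) + c * n

# Closed form: T(n) = c*n*lg(n) + c*n.  Check it for n = 1, 2, 4, ..., 1024.
for k in range(11):
    n = 2 ** k
    assert T(n) == n * math.log2(n) + n
```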
32. Growth of Functions
   We look at input sizes large enough to make only the order of growth of the running time relevant.
33. Asymptotic Notation
   Asymptotic notation is used to describe the running time of an algorithm and is defined in terms of functions whose domains are the set of natural numbers N = {0, 1, 2, …}.
34. θ-Notation
   θ(g(n)) = { f(n) : there exist positive constants c₁, c₂, and n₀ such that 0 ≤ c₁g(n) ≤ f(n) ≤ c₂g(n) for all n ≥ n₀ }
35. θ-Notation
   [Slide 35: figure showing f(n) sandwiched between c₁g(n) below and c₂g(n) above for all n ≥ n₀, illustrating f(n) = θ(g(n)).]
36. θ-Notation
   For all n ≥ n₀, the function f(n) is equal to g(n) to within a constant factor, so g(n) is an asymptotically tight bound for f(n).
37. θ-Notation
   Let us show that (1/2)n² − 3n = θ(n²). To do so, we must determine c₁, c₂, and n₀ such that c₁n² ≤ (1/2)n² − 3n ≤ c₂n² for all n ≥ n₀.
38. θ-Notation
   Dividing by n² gives c₁ ≤ 1/2 − 3/n ≤ c₂. By choosing c₁ = 1/14, c₂ = 1/2, and n₀ = 7, we can verify that (1/2)n² − 3n = θ(n²).
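These constants can be checked numerically over a range of n (a small sanity check, not from the slides):

```python
c1, c2, n0 = 1/14, 1/2, 7

# Verify c1*n^2 <= (1/2)n^2 - 3n <= c2*n^2 for all n >= n0 in the test range.
for n in range(n0, 10_000):
    f = 0.5 * n * n - 3 * n
    assert c1 * n * n <= f <= c2 * n * n
```

At n = n₀ = 7 the lower inequality is tight: c₁·49 = 3.5 = (1/2)·49 − 21, which is why n₀ cannot be chosen any smaller with c₁ = 1/14.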
39. O-Notation
   O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n₀ }
40. O-Notation
   [Slide 40: figure showing f(n) lying on or below cg(n) for all n ≥ n₀, illustrating f(n) = O(g(n)).]
41. O-Notation
   The O(n²) bound on the worst-case running time of insertion sort also applies to its running time on every input.
42. O-Notation
   The θ(n²) bound on the worst-case running time of insertion sort, however, does not imply a θ(n²) bound on its running time on every input.
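A small experiment illustrates the point: insertion sort performs a quadratic number of comparisons on a reversed input but only a linear number on an already-sorted one (the step-counting helper below is illustrative, not from the slides):

```python
def insertion_sort_steps(A):
    """Sort A in place with insertion sort; return the number of key
    comparisons performed by the inner loop."""
    comparisons = 0
    for j in range(1, len(A)):
        key = A[j]
        i = j - 1
        while i >= 0:
            comparisons += 1
            if A[i] > key:          # shift larger elements right
                A[i + 1] = A[i]
                i -= 1
            else:
                break               # insertion point found
        A[i + 1] = key
    return comparisons

n = 100
sorted_steps = insertion_sort_steps(list(range(n)))           # n - 1 comparisons
reversed_steps = insertion_sort_steps(list(range(n, 0, -1)))  # n(n-1)/2 comparisons
```

The sorted case performs exactly n − 1 comparisons (each element is compared once and left in place), while the reversed case performs n(n − 1)/2: the worst-case O(n²) bound holds for every input, but the θ(n²) behavior does not.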
43. Ω-Notation
   Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n₀ }
44. Ω-Notation
   [Slide 44: figure showing f(n) lying on or above cg(n) for all n ≥ n₀, illustrating f(n) = Ω(g(n)).]
45. Ω-Notation
   Since Ω-notation describes a lower bound, we use it to bound the best-case running time of an algorithm; doing so also bounds from below the running time of the algorithm on arbitrary inputs.
46. o-Notation
   We use o-notation to denote an upper bound that is not asymptotically tight. The bound 2n² = O(n²) is asymptotically tight, but the bound 2n = O(n²) is not.
47. ω-Notation
   We use ω-notation to denote a lower bound that is not asymptotically tight.