Algorithm.ppt

Transcript

• 1. Divide-and-Conquer
• Divide the problem into a number of subproblems.
• Conquer the subproblems by solving them recursively. If the subproblem sizes are small enough, solve the subproblems in a straightforward manner.
• 2. Divide-and-Conquer
• Combine the solutions to the subproblems into the solution for the original problem.
• 3. Merge Sort Algorithm
• Divide: Divide the n-element sequence into two subsequences of n/2 elements each.
• Conquer: Sort the two subsequences recursively using merge sort.
• Combine: Merge the two sorted subsequences.
• 4. How to merge two sorted sequences
• We have two subarrays A[p..q] and A[q+1..r] in sorted order.
• The merge sort algorithm merges them to form a single sorted subarray that replaces the current subarray A[p..r].
• 5. Merging Algorithm
• Merge(A, p, q, r)
• 1. n1 ← q − p + 1
• 2. n2 ← r − q
• 3. Create arrays L[1..n1 + 1] and R[1..n2 + 1]
• 6. Merging Algorithm
• 4. For i ← 1 to n1
• 5. do L[i] ← A[p + i − 1]
• 6. For j ← 1 to n2
• 7. do R[j] ← A[q + j]
• 7. Merging Algorithm
• 8. L[n1 + 1] ← ∞
• 9. R[n2 + 1] ← ∞
• 10. i ← 1
• 11. j ← 1
• 8. Merging Algorithm
• 12. For k ← p to r
• 13. do if L[i] <= R[j]
• 14. then A[k] ← L[i]
• 15. i ← i + 1
• 16. else A[k] ← R[j]
• 17. j ← j + 1
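The Merge pseudocode above can be sketched in Python — a minimal translation, assuming 0-based list indexing (so p, q, r are 0-based positions) and using float('inf') for the ∞ sentinels:

```python
def merge(A, p, q, r):
    """Merge the sorted subarrays A[p..q] and A[q+1..r] in place."""
    # Copy the two halves, appending an infinite sentinel to each
    # so neither loop needs an explicit end-of-subarray check.
    L = A[p:q + 1] + [float('inf')]
    R = A[q + 1:r + 1] + [float('inf')]
    i = j = 0
    for k in range(p, r + 1):
        if L[i] <= R[j]:
            A[k] = L[i]
            i += 1
        else:
            A[k] = R[j]
            j += 1
```

For example, with A = [2, 4, 5, 7, 1, 2, 3, 6], calling merge(A, 0, 3, 7) merges the two sorted halves into one sorted run.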
• 9.–17. Merging Algorithm (a)–(i): [Figure slides stepping the Merge procedure through the subarrays L = [2, 4, 5, 7, ∞] and R = [1, 2, 3, 6, ∞], showing how k sweeps across A[p..r] while i and j advance, leaving the merged subarray 1, 2, 2, 3, 4, 5, 6, 7 in A.]
• 18. Merge Sort Algorithm
• Merge-Sort( A, p, r)
• 1. If p < r
• 2. then q ← ⌊(p + r)/2⌋
• 3. Merge-Sort(A, p, q)
• 4. Merge-Sort(A, q + 1, r)
• 5. Merge(A, p, q, r)
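A minimal Python sketch of the Merge-Sort pseudocode, bundled with the Merge procedure it calls (assumes 0-based indices p and r into a Python list):

```python
def merge(A, p, q, r):
    # Merge sorted A[p..q] and A[q+1..r] in place, using ∞ sentinels
    # as in the Merge pseudocode.
    L = A[p:q + 1] + [float('inf')]
    R = A[q + 1:r + 1] + [float('inf')]
    i = j = 0
    for k in range(p, r + 1):
        if L[i] <= R[j]:
            A[k] = L[i]; i += 1
        else:
            A[k] = R[j]; j += 1

def merge_sort(A, p, r):
    if p < r:
        q = (p + r) // 2          # Divide: split at the midpoint
        merge_sort(A, p, q)       # Conquer: sort the left half
        merge_sort(A, q + 1, r)   # Conquer: sort the right half
        merge(A, p, q, r)         # Combine: merge the two sorted halves
```

Calling merge_sort(A, 0, len(A) - 1) sorts the whole list in place.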
• 19. Operation of Merge Sort: [Figure slide: the recursion tree for the input sequence 5, 2, 4, 7, 1, 3, 2, 6, merged pairwise bottom-up into the sorted sequence 1, 2, 2, 3, 4, 5, 6, 7.]
• 20. Analyzing Divide-and-Conquer Algorithms When an algorithm contains a recursive call to itself, its running time can be described by a recurrence equation, or recurrence, which expresses the overall running time in terms of the running time on smaller inputs.
• 21. Recurrence If the problem size is small enough, say n <= c for some constant c, the straightforward solution takes constant time, which we write as Θ(1).
• 22. Recurrence
• If we have
• a subproblems, each of which is 1/b the size of the original,
• D(n) time to divide the problem into subproblems,
• 23. Recurrence
• C(n) time to combine the solutions to the subproblems
• The recurrence:
• T(n) = Θ(1) if n <= c
• T(n) = aT(n/b) + D(n) + C(n) otherwise
• 24. Recurrence Divide: The divide step computes the middle of the subarray, which takes constant time, so D(n) = Θ(1).
• 25. Recurrence Conquer: We recursively solve two subproblems, each of size n/2, which contributes 2T(n/2) to the running time.
• 26. Recurrence Combine: The Merge procedure takes Θ(n) time on an n-element subarray, so C(n) = Θ(n). The recurrence:
T(n) = Θ(1) if n = 1
T(n) = 2T(n/2) + Θ(n) if n > 1
• 27. Recurrence Let us rewrite the recurrence, where the constant c represents the time required to solve problems of size 1:
T(n) = c if n = 1
T(n) = 2T(n/2) + cn if n > 1
• 28.–30. A Recursion Tree for the Recurrence: [Figure slides expanding the recurrence level by level: the root costs cn and has children T(n/2), T(n/2); expanding again gives two nodes of cost cn/2 over four T(n/4) subproblems; fully expanded, the tree has n leaves of cost c at the bottom, and every level contributes a total cost of cn across lg n + 1 levels.]
• 31. Total Running Time The fully expanded tree has lg n + 1 levels, and each level contributes a total cost of cn. Therefore T(n) = cn lg n + cn = Θ(n lg n).
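The closed form can be spot-checked by evaluating the recurrence directly — a small sketch, assuming n is a power of two and taking c = 1:

```python
import math

def T(n, c=1):
    """Evaluate T(n) = c if n == 1, else 2*T(n/2) + c*n (n a power of two)."""
    return c if n == 1 else 2 * T(n // 2, c) + c * n

# The closed form derived from the recursion tree: T(n) = c*n*lg n + c*n.
for n in (1, 2, 8, 1024):
    assert T(n) == n * math.log2(n) + n
```

Since lg n is exact for powers of two, the comparison holds with equality; e.g. T(8) = 8·3 + 8 = 32.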
• 32. Growth of Functions We look at input sizes large enough to make only the order of growth of the running time relevant.
• 33. Asymptotic Notation Used to describe the running time of an algorithm, and defined in terms of functions whose domains are the set of natural numbers N = {0, 1, 2, ...}
• 34. Θ-Notation Θ(g(n)) = { f(n) : there exist positive constants C1, C2, and n0 such that 0 <= C1·g(n) <= f(n) <= C2·g(n) for all n >= n0 }
• 35. Θ-Notation [Figure slide: graph of f(n) sandwiched between C1·g(n) and C2·g(n) for all n >= n0, illustrating f(n) = Θ(g(n)).]
• 36. Θ-Notation For all n >= n0, the function f(n) is equal to g(n) to within a constant factor. So g(n) is an asymptotically tight bound for f(n).
• 37. Θ-Notation Let us show that (1/2)n² − 3n = Θ(n²). To do so, we must determine C1, C2, and n0 such that C1·n² <= (1/2)n² − 3n <= C2·n² for all n >= n0.
• 38. Θ-Notation Dividing by n² gives C1 <= 1/2 − 3/n <= C2. By choosing C1 = 1/14, C2 = 1/2, and n0 = 7, we can verify that (1/2)n² − 3n = Θ(n²).
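These constants can be spot-checked with a short loop. Scaling the two inequalities (by 14 and by 2) keeps everything in exact integer arithmetic; this verifies a finite range of n, as an illustration rather than a proof:

```python
# Check C1 = 1/14, C2 = 1/2, n0 = 7 for f(n) = (1/2)n^2 - 3n = Θ(n^2).
#   C1*n^2 <= f(n)   scaled by 14:  n^2 <= 7n^2 - 42n
#   f(n) <= C2*n^2   scaled by 2:   n^2 - 6n <= n^2
for n in range(7, 10_000):
    assert n * n <= 7 * n * n - 42 * n   # lower bound; equality holds at n = 7
    assert n * n - 6 * n <= n * n        # upper bound; holds for all n >= 0
```

Note that at n = 7 the lower bound is met with equality (49 = 343 − 294), which is why n0 cannot be taken any smaller with C1 = 1/14.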
• 39. O-Notation O(g(n)) = { f(n) : there exist positive constants C and n0 such that 0 <= f(n) <= C·g(n) for all n >= n0 }
• 40. O-Notation [Figure slide: graph of f(n) lying below C·g(n) for all n >= n0, illustrating f(n) = O(g(n)).]
• 41. O-Notation The O(n²) bound on the worst-case running time of insertion sort also applies to its running time on every input.
• 42. O-Notation The Θ(n²) bound on the worst-case running time of insertion sort, however, does not imply a Θ(n²) bound on its running time on every input.
• 43. Ω-Notation Ω(g(n)) = { f(n) : there exist positive constants C and n0 such that 0 <= C·g(n) <= f(n) for all n >= n0 }
• 44. Ω-Notation [Figure slide: graph of f(n) lying above C·g(n) for all n >= n0, illustrating f(n) = Ω(g(n)).]
• 45. Ω-Notation Since Ω-notation describes a lower bound, we use it to bound the best-case running time of an algorithm, and thereby the running time of the algorithm on arbitrary inputs.
• 46. o-Notation We use o-notation to denote an upper bound that is not asymptotically tight. The bound 2n² = O(n²) is asymptotically tight, but the bound 2n = O(n²) is not.
• 47. ω-Notation We use ω-notation to denote a lower bound that is not asymptotically tight. For example, n²/2 = ω(n), but n²/2 ≠ ω(n²).