1. SORTING
2. Insertion sort, Merge sort
3. Insertion sort
   1) Initially p = 1.
   2) Let the first p elements be sorted.
   3) Insert the (p+1)th element properly in the list so that now p+1 elements are sorted.
   4) Increment p and go to step (3).
4. Insertion Sort
5. Insertion Sort...
   - Consists of N - 1 passes.
   - For pass p = 1 through N - 1, ensures that the elements in positions 0 through p are in sorted order:
      - elements in positions 0 through p - 1 are already sorted
      - move the element in position p left until its correct place is found among the first p + 1 elements
   See applet: http://www.cis.upenn.edu/~matuszek/cse121-2003/Applets/Chap03/Insertion/InsertSort.html
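A minimal C++ sketch of these passes (an illustration only, not the textbook's template version; the name insertionSort and the std::vector interface are assumptions):

```cpp
#include <cstddef>
#include <vector>

// Insertion sort as described above: for each pass p, slide the element at
// position p left past every larger element until its place among the
// first p + 1 elements is found.
void insertionSort(std::vector<int>& a) {
    for (std::size_t p = 1; p < a.size(); ++p) {
        int tmp = a[p];
        std::size_t j = p;
        while (j > 0 && a[j - 1] > tmp) {   // shift larger elements one slot right
            a[j] = a[j - 1];
            --j;
        }
        a[j] = tmp;                         // drop tmp into its correct place
    }
}
```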
6. Extended Example
   To sort the following numbers in increasing order: 34 8 64 51 32 21
   P = 1: look at the first element only, no change.
   P = 2: tmp = 8; 34 > tmp, so the second element is set to 34. We have reached the front of the list, so 1st position = tmp.
   After second pass: 8 34 64 51 32 21 (first 2 elements are sorted)
7. Extended Example (continued)
   P = 3: tmp = 64; 34 < 64, so stop at the 3rd position and set 3rd position = 64.
   After third pass: 8 34 64 51 32 21 (first 3 elements are sorted)
   P = 4: tmp = 51; 51 < 64, so we have 8 34 64 64 32 21; 34 < 51, so stop at the 2nd position and set 3rd position = tmp.
   After fourth pass: 8 34 51 64 32 21 (first 4 elements are sorted)
   P = 5: tmp = 32; 32 < 64, so 8 34 51 64 64 21; 32 < 51, so 8 34 51 51 64 21; next 32 < 34, so 8 34 34 51 64 21; next 32 > 8, so stop at the 1st position and set 2nd position = 32.
   After fifth pass: 8 32 34 51 64 21
   P = 6: tmp = 21, . . .
   After sixth pass: 8 21 32 34 51 64
8. Analysis: worst-case running time
   - The inner loop is executed at most p times, for each p = 1..N-1.
   - Overall: 1 + 2 + 3 + ... + (N-1) = O(N^2)
   - Space requirement is O(N).
9. Analysis
   - The bound is tight: Θ(N^2).
   - That is, there exists some input which actually uses Θ(N^2) time.
   - Consider an input that is a reverse-sorted list:
      - When A[p] is inserted into the sorted A[0..p-1], we need to compare A[p] with all elements in A[0..p-1] and move each element one position to the right: Θ(p) steps.
      - The total number of steps is Θ(Σ_{p=1}^{N-1} p) = Θ(N(N-1)/2) = Θ(N^2).
10. Analysis: best case
   - The input is already sorted in increasing order:
      - When inserting A[p] into the sorted A[0..p-1], we only need to compare A[p] with A[p-1], and there is no data movement.
      - For each iteration of the outer for-loop, the inner for-loop terminates after checking the loop condition once => O(N) time.
   - If the input is nearly sorted, insertion sort runs fast.
11. Mergesort
   - Based on the divide-and-conquer strategy.
   - Divide the list into two smaller lists of about equal sizes.
   - Sort each smaller list recursively.
   - Merge the two sorted lists to get one sorted list.
   - How do we divide the list? How much time is needed?
   - How do we merge the two sorted lists? How much time is needed?
12. Dividing
   - If the input list is a linked list, dividing takes Θ(N) time:
      - we scan the linked list, stop at the ⌈N/2⌉-th entry and cut the link.
   - If the input list is an array A[0..N-1], dividing takes O(1) time:
      - we can represent a sublist by two integers left and right: to divide A[left..right], we compute center = (left + right)/2 and obtain A[left..center] and A[center+1..right].
13. Mergesort
   - Divide-and-conquer strategy:
      - recursively mergesort the first half and the second half
      - merge the two sorted halves together
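A sketch of this recursive structure, assuming the array representation from the Dividing slide (mergeSort is an assumed name; the textbook's template version differs in detail). It relies on the merge routine sketched after the "How to merge?" slide below:

```cpp
#include <vector>

// Declared here; a sketch of its body follows the "How to merge?" slide.
void merge(std::vector<int>& a, std::vector<int>& tmp, int left, int center, int right);

// Recursively mergesort a[left..right]: split at center (O(1) for an array,
// as on the previous slide), sort each half, then merge the sorted halves.
void mergeSort(std::vector<int>& a, std::vector<int>& tmp, int left, int right) {
    if (left >= right) return;              // zero or one element: already sorted
    int center = (left + right) / 2;
    mergeSort(a, tmp, left, center);
    mergeSort(a, tmp, center + 1, right);
    merge(a, tmp, left, center, right);
}

// Driver: allocate the linear extra memory once and sort the whole array.
void mergeSort(std::vector<int>& a) {
    std::vector<int> tmp(a.size());
    if (!a.empty())
        mergeSort(a, tmp, 0, static_cast<int>(a.size()) - 1);
}
```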
14. See applet: http://www.cosc.canterbury.ac.nz/people/mukundan/dsal/MSort.html
15. How to merge?
   - Input: two sorted arrays A and B.
   - Output: an output sorted array C.
   - Three counters: Actr, Bctr, and Cctr, initially set to the beginning of their respective arrays.
   - (1) The smaller of A[Actr] and B[Bctr] is copied to the next entry in C, and the appropriate counters are advanced.
   - (2) When either input list is exhausted, the remainder of the other list is copied to C.
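A hedged sketch of these two steps, written to merge two adjacent sorted runs of one array into a temporary array (rather than two separate arrays A and B), so it plugs into the mergeSort sketch above; aPos, bPos and tmpPos play the role of Actr, Bctr and Cctr:

```cpp
#include <vector>

// Merge the sorted runs a[left..center] and a[center+1..right] into tmp,
// then copy the merged result back into a.
void merge(std::vector<int>& a, std::vector<int>& tmp, int left, int center, int right) {
    int aPos = left, bPos = center + 1, tmpPos = left;
    while (aPos <= center && bPos <= right)             // step (1): copy the smaller front element
        tmp[tmpPos++] = (a[aPos] <= a[bPos]) ? a[aPos++] : a[bPos++];
    while (aPos <= center) tmp[tmpPos++] = a[aPos++];   // step (2): copy whatever remains
    while (bPos <= right)  tmp[tmpPos++] = a[bPos++];
    for (int i = left; i <= right; ++i)                 // copy back: the extra work noted on slide 17
        a[i] = tmp[i];
}
```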
16. Example: Merge
17. Example: Merge...
   - Running time analysis: clearly, merge takes O(m1 + m2), where m1 and m2 are the sizes of the two sublists.
   - Space requirement:
      - merging two sorted lists requires linear extra memory
      - additional work to copy to the temporary array and back
19. Analysis of mergesort
   - Let T(N) denote the worst-case running time of mergesort to sort N numbers.
   - Assume that N is a power of 2.
   - Divide step: O(1) time.
   - Conquer step: 2 T(N/2) time.
   - Combine step: O(N) time.
   - Recurrence equation:
      - T(1) = 1
      - T(N) = 2 T(N/2) + N
20. Analysis: solving the recurrence. Since N = 2^k, we have k = log_2 N.
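The derivation itself did not survive the transcription. One standard way to finish it (a hedged reconstruction, not necessarily the slide's exact steps) is to divide the recurrence by N and telescope:

   T(N) = 2 T(N/2) + N
   T(N)/N = T(N/2)/(N/2) + 1
   T(N/2)/(N/2) = T(N/4)/(N/4) + 1
   ...
   T(2)/2 = T(1)/1 + 1

   Adding the k = log_2 N equations (all intermediate terms cancel):
   T(N)/N = T(1) + log_2 N, so T(N) = N log_2 N + N = O(N log N).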
22. An experiment
   - Code from the textbook (using templates)
   - Unix time utility
23. Heaps and heapsort
24. Motivating Example
   - 3 jobs have been submitted to a printer in the order A, B, C.
   - Sizes: Job A – 100 pages, Job B – 10 pages, Job C – 1 page.
   - Average waiting time with FIFO service: (100 + 110 + 111) / 3 = 107 time units.
   - Average waiting time with shortest-job-first service: (1 + 11 + 111) / 3 = 41 time units.
   - We need a queue which supports insert and deleteMin: a priority queue.
25. Common Implementations
   1) Linked list:
      - insert in O(1)
      - finding the minimum element takes O(n), so deletion is O(n)
   2) Search tree (AVL tree, to be covered later):
      - insert in O(log n)
      - delete in O(log n)
      - a search tree is overkill, since it supports many other operations we do not need
26. Heaps
   - Heaps are "almost complete binary trees": all levels are full except possibly the lowest level.
   - If the lowest level is not full, then the nodes must be packed to the left.
   Note: we define "complete tree" slightly differently from the book.
27. Binary Trees
   - Has a root at the topmost level.
   - Each node has zero, one or two children.
   - A node that has no child is called a leaf.
   - For a node x, we denote the left child, right child and the parent of x as left(x), right(x) and parent(x), respectively.
28. Binary trees
   (Figures: a complete tree and an almost complete tree.)
   A complete binary tree is one where every node has 0 (for the leaves) or 2 children and all leaves are at the same depth.
29. Height (depth) of a binary tree
   The number of edges on the longest path from the root to a leaf.
   (Figure: a tree of height 4.)
30. Height of a binary tree
   - A complete binary tree of N nodes has height O(log N).
   (Figure: a complete tree of depth d has 2^d nodes at depth d and 2^(d+1) - 1 nodes in total.)
31. Proof
   Prove by induction that the number of nodes at depth d is 2^d.
   The total number of nodes of a complete binary tree of depth d is 1 + 2 + 4 + ... + 2^d = 2^(d+1) - 1.
   Thus 2^(d+1) - 1 = N, so d = log_2(N + 1) - 1 = O(log N).
   What is the largest depth of a binary tree of N nodes? N - 1 (a chain of N nodes).
32. Coming back to Heap
   Heap property: the value at each node is less than or equal to that of its descendants.
   (Figures: the tree with level order 4 2 5 6 1 3 is not a heap; the tree with level order 1 2 5 6 4 3 is a heap.)
33. Heap
   - A heap supports the following operations efficiently:
      - locate the current minimum in O(1) time
      - delete the current minimum in O(log N) time
34. Insertion
   - Add the new element to the lowest level.
   - Restore the min-heap property if violated:
      - "bubble up": if the parent of the element is larger than the element, then interchange the parent and child.
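A sketch of insertion, assuming the usual level-order array representation of the heap (the slides only draw the tree): the children of index i live at 2i + 1 and 2i + 2, and its parent at (i - 1) / 2. The name heapInsert is an assumption:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Min-heap kept in level order in a vector: add the new element at the end
// (the lowest level, packed to the left), then percolate it up.
void heapInsert(std::vector<int>& heap, int x) {
    heap.push_back(x);                           // add at the lowest level
    std::size_t child = heap.size() - 1;
    while (child > 0) {                          // "bubble up"
        std::size_t parent = (child - 1) / 2;
        if (heap[parent] <= heap[child]) break;  // heap property holds: done
        std::swap(heap[parent], heap[child]);
        child = parent;
    }
}
```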
35. Insert 2.5
   Percolate up to maintain the heap property.
   (Figures, in level order: the heap 1 2 5 6 4 3 becomes 1 2 5 6 4 3 2.5 after adding 2.5 at the lowest level, and 1 2 2.5 6 4 3 5 after percolating up.)
37. Insertion (figure): the resulting tree is a heap!
38. DeleteMin: first attempt
   - Delete the root.
   - Compare the two children of the root and make the lesser of the two the root.
   - An empty spot is created.
   - Bring the lesser of the two children of the empty spot to the empty spot.
   - A new empty spot is created; continue.
39. Heap property is preserved, but completeness is not preserved!
   (Figure: applying the first attempt to the heap 1 2 5 6 4 3 leaves a hole that violates completeness.)
40. DeleteMin
   - Copy the last number to the root (i.e. overwrite the minimum element stored there).
   - Restore the min-heap property by "bubble down".
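Continuing the same level-order array sketch, a hedged deleteMin (heapDeleteMin is an assumed name; it assumes the heap is non-empty):

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Overwrite the root with the last element, shrink the heap, then bubble
// the root down until the min-heap property is restored. Returns the
// removed minimum.
int heapDeleteMin(std::vector<int>& heap) {
    int minValue = heap.front();
    heap.front() = heap.back();          // copy the last number to the root
    heap.pop_back();
    std::size_t hole = 0;
    const std::size_t n = heap.size();
    while (true) {                       // "bubble down"
        std::size_t left = 2 * hole + 1;
        std::size_t right = left + 1;
        if (left >= n) break;            // no children: done
        std::size_t smaller = (right < n && heap[right] < heap[left]) ? right : left;
        if (heap[hole] <= heap[smaller]) break;
        std::swap(heap[hole], heap[smaller]);
        hole = smaller;
    }
    return minValue;
}
```

Because the hole always moves toward the bottom of an almost complete tree, both insertion and deleteMin touch at most O(log N) nodes, matching the bounds on slide 33.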