This document discusses common sorting algorithms and their time complexities. It covers the O(N^2) algorithms (bubble sort, selection sort, insertion sort), which are slow for large data sets, and the more efficient O(N log N) algorithms (merge sort, quicksort, heap sort). Implementation details and examples are provided for selection sort, insertion sort, merge sort, and quicksort.
2. Motivation of Sorting
• The term list here means a collection of records.
• Each record has one or more fields.
• Each record has a key to distinguish one record from another.
• For example, a phone directory is a list. The name, the phone number, or even the address can serve as the key, depending on the application or need.
3. Two Common Categories
Sorting Algorithms of O(N^2)
• Bubble Sort
• Selection Sort
• Insertion Sort
Sorting Algorithms of O(N log N)
• Heap Sort
• Merge Sort
• Quick Sort
4. For small values of N
• It is important to note that all algorithms appear to run equally fast for small values of N.
• For values of N from the thousands to the millions, the differences between O(N^2) and O(N log N) become dramatically apparent.
5. O(N^2) Sorts
• Easy to program
• Simple to understand
• Very slow, especially for large values of N
• Almost never used in professional software
6. Selection Sort
• More efficient than Bubble Sort.
• Works by finding the smallest element in the unsorted portion of the list and swapping it with the first unsorted element, effectively shrinking the unsorted portion by 1 on each pass.
7. Selection Sort Algorithm
void selectionSort(int a[], int n)
{
    for (int i = 0; i < n - 1; i++)
    {
        // Find the smallest element in a[i..n-1]
        int indx = i, small = a[i];
        for (int j = i + 1; j < n; j++)
        {
            if (a[j] < small)
            {
                small = a[j];
                indx = j;
            }
        }
        // Swap it into position i
        a[indx] = a[i];
        a[i] = small;
    }
}
8. Insertion Sort
• While some elements remain unsorted:
– Using linear search, find the location in the sorted portion where the first element of the unsorted portion should be inserted.
– Move all the elements after the insertion location up one position to make space for the new element.
(Figure: the fourth iteration of this loop on a sample array, showing the next unsorted element being inserted into the sorted portion.)
11. Insertion Sort Algorithm
void insertionSort(int a[], int n)
{
    for (int i = 1; i < n; i++)
    {
        int pos;
        int y = a[i];
        // Shuffle up all sorted items greater than a[i]
        for (pos = i - 1; pos >= 0 && y < a[pos]; pos--)
            a[pos + 1] = a[pos];
        // Insert the current item
        a[pos + 1] = y;
    }
}
12. O(N log N) Sorts
• Fast and efficient.
• More complicated and harder to understand.
• Most make extensive use of recursion and complex data structures.
14. Divide and Conquer
1. Base case: solve the problem directly if it is small enough.
2. Divide the problem into two or more similar, smaller subproblems.
3. Recursively solve the subproblems.
4. Combine the solutions to the subproblems.
15. Divide and Conquer - Sort
Problem:
• Input: A[left..right] – unsorted array of integers
• Output: A[left..right] – sorted in non-decreasing order
16. Divide and Conquer - Sort
1. Base case: at most one element (left ≥ right), return.
2. Divide A into two subarrays, FirstPart and SecondPart, giving two subproblems: sort the FirstPart, and sort the SecondPart.
3. Recursively sort FirstPart and SecondPart.
4. Combine sorted FirstPart and sorted SecondPart.
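The slides describe the divide-and-conquer structure but not the combine step. A minimal C++ sketch of merge sort following the four steps above (the names mergeSort and merge are illustrative, not from the slides):

```cpp
#include <vector>

// Combine step: merge the two sorted halves A[left..mid] and
// A[mid+1..right] back into A[left..right].
void merge(std::vector<int>& A, int left, int mid, int right) {
    std::vector<int> tmp;
    tmp.reserve(right - left + 1);
    int i = left, j = mid + 1;
    while (i <= mid && j <= right)
        tmp.push_back(A[i] <= A[j] ? A[i++] : A[j++]);
    while (i <= mid)   tmp.push_back(A[i++]);
    while (j <= right) tmp.push_back(A[j++]);
    for (int k = 0; k < (int)tmp.size(); k++)
        A[left + k] = tmp[k];
}

void mergeSort(std::vector<int>& A, int left, int right) {
    if (left >= right) return;        // base case: at most one element
    int mid = (left + right) / 2;     // divide into FirstPart, SecondPart
    mergeSort(A, left, mid);          // recursively sort FirstPart
    mergeSort(A, mid + 1, right);     // recursively sort SecondPart
    merge(A, left, mid, right);       // combine the sorted halves
}
```

The merge runs in O(N) per level and there are O(log N) levels, which gives the O(N log N) bound claimed earlier.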
52. Quick Sort
• Divide:
– Pick any element p as the pivot, e.g., the first element.
– Partition the remaining elements into FirstPart, which contains all elements < p, and SecondPart, which contains all elements ≥ p.
• Recursively sort the FirstPart and SecondPart.
• Combine: no work is necessary, since the sorting is done in place.
53. Quick Sort
(Figure: Partition rearranges A into FirstPart (all x < p), the pivot p, and SecondPart (all p ≤ x); recursive calls sort each part, leaving the whole array sorted.)
54. Quick Sort
Quick-Sort(A, left, right)
    if left ≥ right return
    else
        middle = Partition(A, left, right)
        Quick-Sort(A, left, middle-1)
        Quick-Sort(A, middle+1, right)
    end if
68. Partition Example
(Figure: after partitioning around the pivot 4, all elements x < 4 precede the pivot and all elements 4 ≤ x follow it; the pivot is in its correct position.)
69.
Partition(A, left, right)
    x = A[left]
    i = left
    for j = left+1 to right
    {
        if (A[j] < x) {
            i = i + 1
            swap(A[i], A[j])
        }
    }
    swap(A[i], A[left])
    return i
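The pseudocode above can be sketched as runnable C++; this follows the same scheme (pivot = A[left], i tracks the boundary of the "< pivot" region), with lowercase function names as an illustrative choice:

```cpp
#include <utility>

// In-place partition as in the pseudocode: pivot is A[left];
// returns the pivot's final index.
int partition(int A[], int left, int right) {
    int x = A[left];              // pivot
    int i = left;                 // end of the "< pivot" region
    for (int j = left + 1; j <= right; j++) {
        if (A[j] < x) {
            i++;
            std::swap(A[i], A[j]);
        }
    }
    std::swap(A[i], A[left]);     // put the pivot into its correct position
    return i;
}

void quickSort(int A[], int left, int right) {
    if (left >= right) return;    // base case: at most one element
    int middle = partition(A, left, right);
    quickSort(A, left, middle - 1);
    quickSort(A, middle + 1, right);
}
```

No combine step is needed: once the pivot is in its correct position, sorting the two parts in place sorts the whole array.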
83. Radix Sort
Extra information: every integer can be represented by at most k digits:
– d1d2…dk, where the di are digits in base r
– d1: most significant digit
– dk: least significant digit
84. Radix Sort
• Algorithm:
– Sort by the least significant digit first (counting sort); numbers with the same digit go to the same bin.
– Reorder all the numbers: the numbers in bin 0 precede the numbers in bin 1, which precede the numbers in bin 2, and so on.
– Sort by the next least significant digit.
– Continue this process until the numbers have been sorted on all k digits.
85. Radix Sort
• Assume each key has a link field. Then the keys in the same bin are linked together into a chain:
– f[i], 0 ≤ i < r (the pointer to the first record in bin i)
– e[i] (the pointer to the last record in bin i)
– Each chain operates as a queue.
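A minimal C++ sketch of the algorithm above for base r = 10, using std::vector bins as the queues instead of linked chains (the name radixSort and the assumption of non-negative integers are mine, not from the slides):

```cpp
#include <algorithm>
#include <vector>

// LSD radix sort in base 10: distribute the numbers into bins by one
// digit (least significant first), then collect bin 0, bin 1, ... in
// order. Each bin behaves as a queue, preserving the previous pass's
// order among equal digits. Assumes non-negative integers.
void radixSort(std::vector<int>& A) {
    if (A.empty()) return;
    int maxVal = *std::max_element(A.begin(), A.end());
    for (long long exp = 1; maxVal / exp > 0; exp *= 10) {
        std::vector<std::vector<int>> bins(10);   // one queue per digit
        for (int x : A)
            bins[(x / exp) % 10].push_back(x);    // enqueue by current digit
        A.clear();
        for (const auto& bin : bins)              // collect bins in order
            A.insert(A.end(), bin.begin(), bin.end());
    }
}
```

With k digit passes over N numbers, the running time is O(k·N).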
89. Heap Sort
• Typically the slowest of the O(N log N) algorithms in practice.
• However, it has lower memory demands than Merge Sort and Quick Sort.
90. Heap Sort
Works by transferring the items to a heap, which is basically a binary tree in which every parent node has a value greater than or equal to those of its child nodes. The root of the tree, which holds the largest item, is transferred to a new array, and then the heap is re-formed. The process is repeated until the sort is complete.
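A minimal C++ sketch of this idea, using an in-place variant that reuses the tail of the same array instead of a new array (the names heapSort and siftDown are illustrative, not from the slides):

```cpp
#include <utility>
#include <vector>

// Restore the max-heap property for the subtree rooted at i, treating
// A[0..n-1] as a binary tree stored in an array (children of node i
// are at 2i+1 and 2i+2).
void siftDown(std::vector<int>& A, int n, int i) {
    while (true) {
        int largest = i, l = 2 * i + 1, r = 2 * i + 2;
        if (l < n && A[l] > A[largest]) largest = l;
        if (r < n && A[r] > A[largest]) largest = r;
        if (largest == i) break;
        std::swap(A[i], A[largest]);
        i = largest;
    }
}

// Build a max-heap, then repeatedly move the root (the largest
// remaining item) to the end of the array and re-form the heap.
void heapSort(std::vector<int>& A) {
    int n = (int)A.size();
    for (int i = n / 2 - 1; i >= 0; i--)   // build the heap, O(N)
        siftDown(A, n, i);
    for (int end = n - 1; end > 0; end--) {
        std::swap(A[0], A[end]);           // largest goes to its final slot
        siftDown(A, end, 0);               // re-form heap on A[0..end-1]
    }
}
```

Storing the heap in the array itself is what gives heap sort its low memory demands: no auxiliary array or recursion is needed.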