1. Quicksort and its
Time Complexity
Derivations
DONE BY:-
DEEPAK.M
1VE21CA010
BANGALORE
Design and Analysis of Algorithms – 21CS42
2. Contents
• Learning Goals
• Basic Idea of Quick Sort
• Partitioning and Sorting
• Example
• Analysis of Best and Average Case
3. Learning Goals
• After learning this, you should:
• Be able to explain how quicksort works and what its complexity is (worst case, best case, average case).
• Be able to compare quicksort with heapsort and mergesort.
• Be able to explain the advantages and disadvantages of quicksort, and its applications.
4. Introduction
• The basic version of the quicksort algorithm was invented by C. A. R. Hoare in 1960 and formally published in 1962.
• It is based on the principle of divide-and-conquer.
• Quicksort is the algorithm of choice in many situations because it is not difficult to implement and is a good "general purpose" sort.
• It consumes relatively few resources during execution.
5. Quicksort by Hoare (1962)
• Select a pivot (partitioning element).
• Rearrange the list so that all the elements in the positions before the pivot are smaller than or equal to the pivot, and those after the pivot are larger than or equal to the pivot.
• Exchange the pivot with the last element in the first (i.e., ≤) sublist – the pivot is now in its final position.
• Sort the two sublists recursively.
• Example: apply quicksort to sort the list 7 2 9 10 5 4.
(Diagram: elements with A[i] ≤ p on the left, the pivot p in the middle, elements with A[i] ≥ p on the right.)
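The steps above can be sketched in Python. This is not Hoare's in-place scheme but a compact pivot-and-recurse illustration; the function name and the list-comprehension split are mine:

```python
def quicksort(lst):
    """Sort a list by partitioning around a pivot, then recursing (sketch)."""
    if len(lst) <= 1:
        return lst
    pivot = lst[0]                                  # pick the first element as pivot
    smaller = [x for x in lst[1:] if x <= pivot]    # elements <= pivot
    larger = [x for x in lst[1:] if x > pivot]      # elements > pivot
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([7, 2, 9, 10, 5, 4]))  # [2, 4, 5, 7, 9, 10]
```

Note this sketch allocates new lists at every level; the slides that follow develop the in-place version.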
6. Features
• Similar to mergesort – a divide-and-conquer recursive algorithm
• One of the fastest sorting algorithms
• Average running time O(N log N)
• Worst-case running time O(N²)
7. Good Points
• It is in-place since it uses only a small auxiliary stack.
• It requires only n log(n) time on average to sort n items.
• It has an extremely short inner loop.
8. Bad Points
• It is recursive. If recursion is not available, the implementation is extremely complicated.
• It requires quadratic (i.e., n²) time in the worst case.
• It is fragile, i.e., a simple mistake in the implementation can go unnoticed and cause it to perform badly.
9. Description of Quick Sort
• Quicksort works by partitioning a given array A[p . . r] into two non-empty subarrays A[p . . q] and A[q+1 . . r] such that every key in A[p . . q] is less than or equal to every key in A[q+1 . . r].
• Then the two subarrays are sorted by recursive calls to quicksort.
• The exact position of the partition depends on the given array; the index q is computed as a part of the partitioning procedure.
10. Quick Sort Design
• Follows the divide-and-conquer paradigm.
• Divide: Partition (separate) the array A[p..r] into two
(possibly empty) subarrays A[p..q–1] and A[q+1..r].
• Each element in A[p..q–1] ≤ A[q].
• A[q] ≤ each element in A[q+1..r].
• Index q is computed as part of the partitioning procedure.
• Conquer: Sort the two subarrays by recursive calls to
quicksort.
• Combine: The subarrays are sorted in place – no
work is needed to combine them.
• How do the divide and combine steps of quicksort
compare with those of merge sort?
11. Quick Sort:
• Like mergesort, quicksort is also based on the divide-and-conquer paradigm.
• But it uses this technique in a somewhat opposite manner, as all the hard work is done before the recursive calls.
• It works as follows:
1. First, it partitions an array into two parts.
2. Then, it sorts the parts independently.
3. Finally, it combines the sorted subsequences by a simple concatenation.
12. Quick Sort (Cont.…)
The quick-sort algorithm consists of the following three steps:
1. Divide: Partition the list.
• To partition the list, we first choose some element from the list for which we hope about half the elements will come before it and half after. Call this element the pivot.
• Then we partition the elements so that all those with values less than the pivot come in one sublist and all those with greater values come in another.
2. Recursion: Recursively sort the sublists separately.
3. Conquer: Put the sorted sublists together.
13. Quick Sort Procedure
QUICKSORT (A, p, r)
if p < r then
    q ← PARTITION (A, p, r)
    Recursive call to QUICKSORT (A, p, q)
    Recursive call to QUICKSORT (A, q + 1, r)
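A runnable Python sketch of this procedure, assuming the Hoare-style PARTITION of slide 15 with pivot x = A[p]; indices here are zero-based, unlike the pseudocode:

```python
def partition(a, p, r):
    """Hoare partition: pivot x = a[p]; returns the split index j."""
    x = a[p]
    i, j = p - 1, r + 1
    while True:
        j -= 1
        while a[j] > x:                   # scan right-to-left until a[j] <= x
            j -= 1
        i += 1
        while a[i] < x:                   # scan left-to-right until a[i] >= x
            i += 1
        if i < j:
            a[i], a[j] = a[j], a[i]       # exchange the out-of-place pair
        else:
            return j

def quicksort(a, p, r):
    if p < r:
        q = partition(a, p, r)
        quicksort(a, p, q)                # sort A[p..q]
        quicksort(a, q + 1, r)            # sort A[q+1..r]

data = [7, 2, 9, 10, 5, 4]
quicksort(data, 0, len(data) - 1)
print(data)  # [2, 4, 5, 7, 9, 10]
```

With this partition, recursing on (p, q) and (q+1, r) is essential; (p, q–1) and (q+1, r) would be wrong because the pivot is not guaranteed to sit at index q.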
14. Partition – Choosing the pivot
• First, we have to select a pivot element among the elements
of the given array, and we put this pivot into the first
location of the array before partitioning.
• Which array item should be selected as pivot?
• Somehow we have to select a pivot, and we hope that we
will get a good partitioning.
• If the items in the array are arranged randomly, we can choose a pivot randomly.
• We can choose the first or last element as a pivot (it may
not give a good partitioning).
• We can use different techniques to select the pivot.
15. Partitioning The Array
PARTITION (A, p, r)
x ← A[p]
i ← p-1
j ← r+1
while TRUE do
    Repeat j ← j-1
    until A[j] ≤ x
    Repeat i ← i+1
    until A[i] ≥ x
    if i < j
        then exchange A[i] ↔ A[j]
        else return j
The partitioning procedure rearranges the subarray A[p..r] in place, splitting it into a left part with keys ≤ the pivot and a right part with keys ≥ the pivot.
16. How to Place
• Partition selects the first key, A[p], as a pivot key about which the array will be partitioned:
• Keys ≤ A[p] will be moved towards the left.
• Keys ≥ A[p] will be moved towards the right.
• The running time of the partition procedure is Θ(n), where n = r – p + 1 is the number of keys in the array.
17. Example
Partition(A, p, r)
    x, i := A[r], p – 1;
    for j := p to r – 1 do
        if A[j] ≤ x then
            i := i + 1;
            A[i] ↔ A[j]
    A[i + 1] ↔ A[r];
    return i + 1

initially (pivot x = A[r] = 6):   2 5 8 3 9 4 1 7 10 6
next iteration:                   2 5 8 3 9 4 1 7 10 6
next iteration:                   2 5 8 3 9 4 1 7 10 6
next iteration:                   2 5 8 3 9 4 1 7 10 6
next iteration (swap 8 ↔ 3):      2 5 3 8 9 4 1 7 10 6
18. Example (Continued)
next iteration:                   2 5 3 8 9 4 1 7 10 6
next iteration:                   2 5 3 8 9 4 1 7 10 6
next iteration (swap 8 ↔ 4):      2 5 3 4 9 8 1 7 10 6
next iteration (swap 9 ↔ 1):      2 5 3 4 1 8 9 7 10 6
next iteration:                   2 5 3 4 1 8 9 7 10 6
next iteration:                   2 5 3 4 1 8 9 7 10 6
after final swap (8 ↔ 6):         2 5 3 4 1 6 9 7 10 8
The pivot 6 is now in its final position, with all smaller keys to its left and all larger keys to its right.
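The partition procedure traced in this example translates directly to Python (zero-based indices; the exchanges A[i] ↔ A[j] become tuple swaps):

```python
def partition(a, p, r):
    """Lomuto partition: pivot x = a[r]; returns the pivot's final index."""
    x = a[r]
    i = p - 1
    for j in range(p, r):
        if a[j] <= x:                        # a[j] belongs in the <= region
            i += 1
            a[i], a[j] = a[j], a[i]          # grow the <= region by one
    a[i + 1], a[r] = a[r], a[i + 1]          # place the pivot between the regions
    return i + 1

a = [2, 5, 8, 3, 9, 4, 1, 7, 10, 6]
q = partition(a, 0, len(a) - 1)
print(q, a)  # 5 [2, 5, 3, 4, 1, 6, 9, 7, 10, 8]
```

The result matches the trace: the pivot 6 lands at index 5, reproducing the "after final swap" row above.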
19. Partitioning
• Select the last element A[r] in the subarray A[p..r] as the pivot – the element around which to partition.
• As the procedure executes, the array is partitioned into four (possibly empty) regions.
1. A[p..i] — All entries in this region are ≤ pivot.
2. A[i+1..j – 1] — All entries in this region are > pivot.
3. A[r] = pivot.
4. A[j..r – 1] — Not known how they compare to the pivot.
• Properties 1–3 hold before each iteration of the for loop, and constitute a loop invariant. (Region 4 is not part of the loop invariant.)
20. Complexity of Partition
• PartitionTime(n) is given by the number of iterations of the for loop: Θ(n), where n = r – p + 1.
Partition(A, p, r)
    x, i := A[r], p – 1;
    for j := p to r – 1 do
        if A[j] ≤ x then
            i := i + 1;
            A[i] ↔ A[j]
    A[i + 1] ↔ A[r];
    return i + 1
21. Quicksort Overview
To sort a[left...right]:
1. if left < right:
   1.1 Partition a[left...right] such that: all a[left...p-1] are less than a[p], and all a[p+1...right] are >= a[p]
   1.2 Quicksort a[left...p-1]
   1.3 Quicksort a[p+1...right]
2. Terminate
22. Partitioning in Quicksort
• A key step in the quicksort algorithm is partitioning the array.
• We choose some (any) number p in the array to use as a pivot.
• We partition the array into three parts: the numbers less than p, then p itself, then the numbers greater than or equal to p.
23. Alternative Partitioning
• Choose an array value (say, the first) to use as
the pivot
• Starting from the left end, find the first
element that is greater than or equal to the
pivot
• Searching backward from the right end, find
the first element that is less than the pivot
• Interchange (swap) these two elements
• Repeat, searching from where we left off, until
done
24. Alternative Partitioning
To partition a[left...right]:
1. Set pivot = a[left], l = left + 1, r = right;
2. while l < r, do
   2.1 while l < right & a[l] < pivot, set l = l + 1
   2.2 while r > left & a[r] >= pivot, set r = r - 1
   2.3 if l < r, swap a[l] and a[r]
3. Set a[left] = a[r], a[r] = pivot
4. Terminate
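A Python sketch of steps 1–4, zero-based; the helper name `partition_first` is mine, not from the slides:

```python
def partition_first(a, left, right):
    """Partition a[left..right] around pivot a[left] with two scanning pointers."""
    pivot = a[left]
    l, r = left + 1, right
    while l < r:
        while l < right and a[l] < pivot:      # step 2.1: find a[l] >= pivot
            l += 1
        while r > left and a[r] >= pivot:      # step 2.2: find a[r] < pivot
            r -= 1
        if l < r:                              # step 2.3: swap the pair
            a[l], a[r] = a[r], a[l]
    a[left] = a[r]                             # step 3: drop the pivot into place
    a[r] = pivot
    return r

a = [6, 2, 8, 3, 9, 4, 1, 7, 10, 5]
q = partition_first(a, 0, len(a) - 1)
print(q, a)  # the pivot 6 ends up at index q, smaller keys left, others right
```

After the call, every element before index q is less than the pivot and every element after it is greater than or equal, which is exactly the precondition the recursive calls need.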
26. Analysis of quicksort—best case
• Suppose each partition operation divides the array almost exactly in half.
• Then the depth of the recursion is log2n.
• Because that's how many times we can halve n.
• We note that:
• Each partition is linear over its subarray.
• All the partitions at one level cover the array.
27. Best Case Analysis
• The best thing that could happen in quicksort would be that each partitioning stage divides the array exactly in half. In other words, the best case occurs when the pivot is the median of the keys in A[p . . r] every time the procedure 'Partition' is called, so that 'Partition' always splits the array to be sorted into two equal-sized subarrays.
• If the procedure 'Partition' produces two regions of size n/2, the recurrence relation is then
T(n) = T(n/2) + T(n/2) + Θ(n)
     = 2T(n/2) + Θ(n)
• From case 2 of the Master theorem,
T(n) = Θ(n lg n)
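The same recurrence can be solved directly by repeated expansion, writing the Θ(n) partitioning cost as cn for some constant c:

```latex
\begin{aligned}
T(n) &= 2T(n/2) + cn \\
     &= 4T(n/4) + 2cn \\
     &= 8T(n/8) + 3cn \\
     &\;\;\vdots \\
     &= 2^{k}\,T(n/2^{k}) + kcn \\
     &= n\,T(1) + cn\log_{2} n \qquad (k = \log_{2} n) \\
     &= \Theta(n \log n)
\end{aligned}
```

Each expansion doubles the number of subproblems while halving their size, so every level contributes the same cn of work, across log₂n levels.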
28. Best Case Analysis
• We cut the array size in half each time.
• So the depth of the recursion is log2n.
• At each level of the recursion, all the partitions at that level do work that is linear in n.
• O(log2n) * O(n) = O(n log2n)
• Hence in the best case, quicksort has time complexity O(n log2n).
• What about the worst case?
29. Worst Case Partitioning
• The worst case occurs if the given array A[1 . . n] is already sorted. The PARTITION (A, p, r) call then always returns p, so successive calls to partition work on arrays of length n, n-1, n-2, . . . , 2, giving a running time proportional to n + (n-1) + (n-2) + . . . + 2 = [(n+2)(n-1)]/2 = Θ(n²). The worst case also occurs if A[1 . . n] starts out in reverse order.
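The closed form of that sum follows from the standard arithmetic-series formula:

```latex
n + (n-1) + \cdots + 2
  \;=\; \sum_{k=2}^{n} k
  \;=\; \frac{n(n+1)}{2} - 1
  \;=\; \frac{n^{2} + n - 2}{2}
  \;=\; \frac{(n+2)(n-1)}{2}
  \;=\; \Theta(n^{2})
```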
30. Worst case
• In the worst case, partitioning always
divides the size n array into these three
parts:
• A length one part, containing the pivot itself
• A length zero part, and
• A length n-1 part, containing everything else
• We don’t recur on the zero-length part
• Recurring on the length n-1 part requires (in
the worst case) recurring to depth n-1
31. Worst case for quicksort
• In the worst case, recursion may be n levels deep (for an array of size n)
• But the partitioning work done at each level is still n
• O(n) * O(n) = O(n²)
• So the worst case for quicksort is O(n²)
• When does this happen?
• There are many arrangements that could make this happen
• Here are two common cases:
• When the array is already sorted
• When the array is inversely sorted (sorted in the opposite order)
32. Typical case for quicksort
• If the array is sorted to begin with, quicksort is terrible: O(n²)
• It is possible to construct other bad cases
• However, quicksort is usually O(n log2n)
• The constants are so good that quicksort is generally the fastest sorting algorithm in practice
• Most real-world sorting is done by quicksort
33. Picking a better pivot
• Before, we picked the first element of the subarray to use as a pivot
• If the array is already sorted, this results in O(n²) behavior
• It's no better if we pick the last element
• We could do an optimal quicksort (guaranteed O(n log n)) if we always picked a pivot value that exactly cuts the array in half
• Such a value is called a median: half of the values in the array are larger, half are smaller
• The easiest way to find the median is to sort the array and pick the value in the middle (!)
34. Median of three
• Obviously, it doesn't make sense to sort the array in order to find the median to use as a pivot.
• Instead, compare just three elements of our (sub)array—the first, the last, and the middle.
• Take the median (middle value) of these three as the pivot.
• It's possible (but not easy) to construct cases which will make this technique O(n²).
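A minimal Python sketch of median-of-three pivot selection (the function name is mine):

```python
def median_of_three_index(a, lo, hi):
    """Return the index of the median of a[lo], a[mid], a[hi]."""
    mid = (lo + hi) // 2
    # Sort the three (value, index) pairs; the middle pair holds the median.
    trio = sorted([(a[lo], lo), (a[mid], mid), (a[hi], hi)])
    return trio[1][1]

a = [9, 1, 5, 7, 3]
p = median_of_three_index(a, 0, len(a) - 1)
print(a[p])  # 5: the median of a[0]=9, a[2]=5, a[4]=3
```

A quicksort would then swap this element into the pivot position before partitioning; on already-sorted input the middle element is the true median, so the O(n²) case disappears.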
35. Quicksort for Small Arrays
• For very small arrays (N <= 20), quicksort does not perform as well as insertion sort
• A good cutoff range is N = 10
• Switching to insertion sort for small arrays can save about 15% in the running time
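One way to sketch the hybrid in Python, combining the Lomuto partition from the earlier slides with an insertion-sort cutoff; the cutoff value 10 comes from this slide, and the helper names are illustrative, not a tuned production sort:

```python
CUTOFF = 10  # subarrays at or below this size go to insertion sort

def insertion_sort(a, lo, hi):
    """Sort a[lo..hi] in place by insertion."""
    for i in range(lo + 1, hi + 1):
        key, j = a[i], i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]          # shift larger keys right
            j -= 1
        a[j + 1] = key

def partition(a, lo, hi):
    """Lomuto partition around pivot a[hi]; returns the pivot's final index."""
    x, i = a[hi], lo - 1
    for j in range(lo, hi):
        if a[j] <= x:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[hi] = a[hi], a[i + 1]
    return i + 1

def hybrid_quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if hi - lo + 1 <= CUTOFF:
        insertion_sort(a, lo, hi)    # small subarray: switch algorithms
        return
    q = partition(a, lo, hi)
    hybrid_quicksort(a, lo, q - 1)
    hybrid_quicksort(a, q + 1, hi)

data = [5, 1, 4, 12, 8, 0, 3, 9, 11, 7, 2, 10, 6]
hybrid_quicksort(data)
print(data)  # 0 through 12 in order
```

The recursion stops paying its overhead exactly where insertion sort's low constant factor wins, which is where the roughly 15% saving comes from.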
36. Summary of Quicksort
• Best case: split in the middle — Θ(n log n)
• Worst case: sorted array! — Θ(n²)
• Average case: random arrays — Θ(n log n)
• Considered the method of choice for internal sorting of large files (n ≥ 10000)
Improvements:
• better pivot selection: median-of-three partitioning avoids the worst case on sorted files
• switch to insertion sort on small subfiles
• elimination of recursion
These combine to give a 20–25% improvement.
37. Conclusion
• Quicksort is an in-place sorting algorithm whose worst-case running time is Θ(n²) and expected running time is Θ(n lg n), where the constants hidden in Θ(n lg n) are small.
38. Summary of Sorting Algorithms
Algorithm      | Time                | Notes
selection-sort | O(n²)               | in-place; slow (good for small inputs)
insertion-sort | O(n²)               | in-place; slow (good for small inputs)
quick-sort     | O(n log n) expected | in-place, randomized; fastest (good for large inputs)
heap-sort      | O(n log n)          | in-place; fast (good for large inputs)
merge-sort     | O(n log n)          | sequential data access; fast (good for huge inputs)
39. Characteristics
• Advantages
• One of the fastest algorithms on average.
• Does not need additional memory (the sorting takes place in the array – this is called in-place processing). Compare with mergesort: mergesort needs additional memory for merging.
• Disadvantage
• The worst-case complexity is O(N²)
40. Applications
• Commercial applications use quicksort – it generally runs fast with no additional memory, which compensates for the rare occasions when it runs in O(N²)
• Never use it in applications which require guaranteed response time:
• Life-critical (medical monitoring, life support in aircraft and spacecraft)
• Mission-critical (monitoring and control in industrial and research plants handling dangerous materials, control for aircraft, defense, etc.)
• unless you assume the worst-case response time.