G Bharathkumar, Assistant Professor, CSE Department
MODULE-4
Objectives of Sorting:
The objective of the sorting algorithm is to rearrange the records so that their keys are
ordered according to some well-defined ordering rule.
Problem: Given an array of n real number A[1.. n].
Objective: Sort the elements of A in ascending order of their values.
Internal Sort
If the file to be sorted will fit into memory or equivalently if it will fit into an array, then the
sorting method is called internal. In this method, any record can be accessed easily.
External Sort
 Sorting files from tape or disk.
 In this method, an external sort algorithm must access records sequentially, or at least
in the block.
Memory Requirement
1. Sort in place and use no extra memory except perhaps for a small stack or table.
2. Algorithm that use a linked-list representation and so use N extra words of memory
for list pointers.
3. Algorithms that need enough extra memory space to hold another copy of the array to
be sorted.
Stability
A sorting algorithm is called stable if it preserves the relative order of equal keys in the
file. Most of the simple algorithms are stable, but most of the well-known sophisticated
algorithms are not.
There are two classes of sorting algorithms namely, O(n2)-algorithms and O(n log n)-
algorithms. O(n2)-class includes bubble sort, insertion sort, selection sort and shell sort. O(n
log n)-class includes heap sort, merge sort and quick sort.
O(n2) Sorting Algorithms
O(n log n) Sorting Algorithms
Sorting Properties
Property Description
Adaptive A sort is adaptive if it runs faster on a partially sorted array.
Stable A sort is stable if it preserves the relative order of equal keys in the
array.
In Situ An in situ (“in place”) sort moves the items within the array itself and, thus,
requires only a small O(1) amount of extra storage.
Online An online sort can process its data piece-by-piece in serial fashion without
having the entire array available from the beginning of the algorithm.
Properties Of Sorting Algorithms
Algorithm        Adaptive  Stable  In Situ  Online
Linear Insertion Yes       Yes     Yes      Yes
Mergesort        No        Yes     No       Yes
Quicksort        No†       No      Yes      No
†Quicksort actually runs more slowly on a partially sorted array.
Runtime Properties Of Sorting Algorithms
Linear Insertion o Average case Ο(n2)
o Worst-case Ο(n2)
o Runs in O(n) time on a sorted array
Mergesort o Average case Ο(n lg n)
o Worst-case Ο(n lg n)
o Runtime is not affected by the array contents, only the array size
Quicksort o Average case Ο(n lg n)
o Worst-case Ο(n2) on a sorted array
o Median-of-three partitioning makes the Ο(n2) worst case very unlikely in practice
Choosing a Sorting Algorithm
To choose a sorting algorithm for a particular problem, consider the running time, space
complexity, and the expected format of the input list.
Algorithm      Best-case    Worst-case   Average-case  Space Complexity          Stable?
Merge Sort     O(n log n)   O(n log n)   O(n log n)    O(n)                      Yes
Insertion Sort O(n)         O(n2)        O(n2)         O(1)                      Yes
Bubble Sort    O(n)         O(n2)        O(n2)         O(1)                      Yes
Quicksort      O(n log n)   O(n2)        O(n log n)    O(log n) best, O(n) avg   Usually not*
Heapsort       O(n log n)   O(n log n)   O(n log n)    O(1)                      No
Counting Sort  O(k+n)       O(k+n)       O(k+n)        O(k+n)                    Yes
*Most quicksort implementations are not stable, though stable implementations do exist.
When choosing a sorting algorithm to use, weigh these factors. For example, quicksort is a
very fast algorithm but can be pretty tricky to implement; bubble sort is a slow algorithm but
is very easy to implement. To sort small sets of data, bubble sort may be a better option since
it can be implemented quickly, but for larger datasets, the speedup from quicksort might be
worth the trouble implementing the algorithm.
1. Bubble sort
It is also called exchange sort. In this sort, adjacent elements are compared; whenever the
1st element is greater than the 2nd element, they are swapped. For n elements there will be
at most n-1 passes. The efficiency of bubble sort is O(n2).
Bubble Sort is the simplest sorting algorithm; it works by repeatedly swapping adjacent
elements if they are in the wrong order.
Example:
First Pass:
( 5 1 4 2 8 ) –> ( 1 5 4 2 8 ), Here, the algorithm compares the first two elements and swaps them since 5 > 1.
( 1 5 4 2 8 ) –> ( 1 4 5 2 8 ), Swap since 5 > 4
( 1 4 5 2 8 ) –> ( 1 4 2 5 8 ), Swap since 5 > 2
( 1 4 2 5 8 ) –> ( 1 4 2 5 8 ), Now, since these elements are already in order (8 > 5), the algorithm does
not swap them.
Second Pass:
( 1 4 2 5 8 ) –> ( 1 4 2 5 8 )
( 1 4 2 5 8 ) –> ( 1 2 4 5 8 ), Swap since 4 > 2
( 1 2 4 5 8 ) –> ( 1 2 4 5 8 )
( 1 2 4 5 8 ) –> ( 1 2 4 5 8 )
Now, the array is already sorted, but our algorithm does not know if it is completed. The algorithm
needs one whole pass without any swap to know it is sorted.
Third Pass:
( 1 2 4 5 8 ) –> ( 1 2 4 5 8 )
( 1 2 4 5 8 ) –> ( 1 2 4 5 8 )
( 1 2 4 5 8 ) –> ( 1 2 4 5 8 )
( 1 2 4 5 8 ) –> ( 1 2 4 5 8 )
// C program for implementation of Bubble sort
#include <stdio.h>
int main()
{
    int n, i, j, a[100], t;
    printf("Enter number of elements: ");
    scanf("%d", &n);
    printf("Enter the elements into an array: ");
    for (i = 0; i < n; i++)
        scanf("%d", &a[i]);
    for (i = 0; i < n; i++)
    {
        for (j = 0; j < n - i - 1; j++)
        {
            if (a[j] > a[j + 1])
            {
                t = a[j];        /* swap the adjacent pair */
                a[j] = a[j + 1];
                a[j + 1] = t;
            }
        }
    }
    printf("After performing the bubble sort, the sorted array is: ");
    for (i = 0; i < n; i++)
        printf("%d\t", a[i]);
    return 0;
}
Worst and Average Case Time Complexity: O(n2). The worst case occurs when the array is reverse
sorted.
Best Case Time Complexity: O(n). Best case occurs when array is already sorted.
Auxiliary Space: O(1)
Boundary Cases: Bubble sort takes minimum time (Order of n) when elements are already sorted.
Sorting In Place: Yes
Stable: Yes
Due to its simplicity, bubble sort is often used to introduce the concept of a sorting algorithm.
In computer graphics it is popular for its ability to detect a very small error (like a swap of just
two elements) in an almost-sorted array and fix it with just linear complexity (2n). For example, it is
used in a polygon filling algorithm, where bounding lines are sorted by their x coordinate at a
specific scan line (a line parallel to the x axis) and, with incrementing y, their order changes (two
elements are swapped) only at intersections of two lines.
Recurrence form of Bubble-Sort
T(n)=T(n-1) + n
2. Selection Sort:
 Selection sort is a simple sorting algorithm. This sorting algorithm is an in-place
comparison-based algorithm in which the list is divided into two parts, the sorted part
at the left end and the unsorted part at the right end. Initially, the sorted part is empty
and the unsorted part is the entire list.
 The smallest element is selected from the unsorted array and swapped with the
leftmost element, and that element becomes a part of the sorted array. This process
continues moving unsorted array boundary by one element to the right.
 This algorithm is not suitable for large data sets as its average and worst case
complexities are of Ο(n2), where n is the number of items.
Algorithm for Selection Sort:
Step 1 − Set min to the first location
Step 2 − Search the minimum element in the array
Step 3 – swap the first location with the minimum value in the array
Step 4 – assign the second element as min.
Step 5 − Repeat the process until we get a sorted array.
Source Code:
#include <stdio.h>
int main()
{
    int n, i, a[100], j, min, t;
    printf("Enter the number of elements: ");
    scanf("%d", &n);
    printf("Enter the elements into an array: ");
    for (i = 0; i < n; i++)
        scanf("%d", &a[i]);
    for (i = 0; i < n - 1; i++)
    {
        min = i;                 /* index of the smallest element so far */
        for (j = i + 1; j < n; j++)
        {
            if (a[min] > a[j])
                min = j;
        }
        t = a[i];                /* swap it into the sorted region */
        a[i] = a[min];
        a[min] = t;
    }
    printf("After performing the selection sort, the sorted list is:\n");
    for (i = 0; i < n; i++)
        printf("%d\n", a[i]);
    return 0;
}
How Selection Sort Works?
Consider the following depicted array as an example.
For the first position in the sorted list, the whole list is scanned sequentially. While 14 is
presently stored in the first position, we search the whole list and find that 10 is the lowest
value.
So we replace 14 with 10. After one iteration 10, which happens to be the minimum value in
the list, appears in the first position of the sorted list.
For the second position, where 33 is residing, we start scanning the rest of the list in a linear
manner.
We find that 14 is the second lowest value in the list and it should appear at the second
place. We swap these values.
After two iterations, two least values are positioned at the beginning in a sorted manner.
The same process is applied to the rest of the items in the array. Following is a pictorial
depiction of the entire sorting process.
Recurrence Relation - Selection Sort
T(n)=T(n-1) + n-1
3. Insertion sort
Insertion sort is a simple sorting algorithm that works similar to the way you sort
playing cards in your hands. The array is virtually split into a sorted and an unsorted part.
Values from the unsorted part are picked and placed at the correct position in the sorted
part.
Example:
How Insertion Sort Works?
We take an unsorted array for our example.
Insertion sort compares the first two elements.
It finds that both 14 and 33 are already in ascending order. For now, 14 is in sorted sub-list.
Insertion sort moves ahead and compares 33 with 27.
And finds that 33 is not in the correct position.
It swaps 33 with 27. It also checks with all the elements of sorted sub-list. Here we see that
the sorted sub-list has only one element 14, and 27 is greater than 14. Hence, the sorted sub-
list remains sorted after swapping.
By now we have 14 and 27 in the sorted sub-list. Next, it compares 33 with 10.
These values are not in a sorted order.
So we swap them.
However, swapping makes 27 and 10 unsorted.
Hence, we swap them too.
Again we find 14 and 10 in an unsorted order.
We swap them again. By the end of third iteration, we have a sorted sub-list of 4 items.
This process goes on until all the unsorted values are covered in a sorted sub-list. Now we
shall see some programming aspects of insertion sort.
Example 2:
Example 3:
12, 11, 13, 5, 6
Let us loop for i = 1 (second element of the array) to 4 (last element of the array)
i = 1. Since 11 is smaller than 12, move 12 and insert 11 before 12
11, 12, 13, 5, 6
i = 2. 13 will remain at its position as all elements in A[0..i-1] are smaller than 13
11, 12, 13, 5, 6
i = 3. 5 will move to the beginning and all other elements from 11 to 13 will move one
position ahead of their current position.
5, 11, 12, 13, 6
i = 4. 6 will move to position after 5, and elements from 11 to 13 will move one position
ahead of their current position.
5, 6, 11, 12, 13
Algorithm:
Step 1 − If it is the first element, it is already sorted
Step 2 − Pick next element
Step 3 − Compare with all elements in the sorted sub-list
Step 4 − Shift all the elements in the sorted sub-list that is greater than the value to be sorted
Step 5 − Insert the value
Step 6 − Repeat until list is sorted
#include <stdio.h>
int main()
{
    int n, i, j, arr[100], key;
    printf("Enter number of elements: ");
    scanf("%d", &n);
    printf("Enter the elements into an array: ");
    for (i = 0; i < n; i++)
        scanf("%d", &arr[i]);
    for (i = 1; i < n; i++) {
        key = arr[i];
        j = i - 1;
        /* Move elements of arr[0..i-1] that are greater than key
           one position ahead of their current position */
        while (j >= 0 && arr[j] > key) {
            arr[j + 1] = arr[j];
            j = j - 1;
        }
        arr[j + 1] = key;
    }
    printf("After performing the insertion sort, the sorted array is: ");
    for (i = 0; i < n; i++)
        printf("%d\t", arr[i]);
    return 0;
}
4. Shell Sort
Shell sort is a highly efficient sorting algorithm and is based on insertion sort algorithm.
This algorithm avoids large shifts as in case of insertion sort, if the smaller value is to the
far right and has to be moved to the far left.
This algorithm uses insertion sort on a widely spread elements, first to sort them and then
sorts the less widely spaced elements.
ShellSort is mainly a variation of Insertion Sort. In insertion sort, we move elements
only one position ahead. When an element has to be moved far ahead, many movements are
involved. The idea of shellSort is to allow exchange of far items.
Insertion sort does not perform well when close elements are far apart. Shell sort
helps in reducing the distance between close elements, so fewer swaps need to be
performed.
How Shell Sort Works?
1. Suppose, we need to sort the following array.
Initial array
2. We are using Shell's original sequence (N/2, N/4, ..., 1) as intervals in
our algorithm.
In the first loop, if the array size is N = 8 then, the elements lying at the
interval of N/2 = 4 are compared and swapped if they are not in order.
a. The 0th element is compared with the 4th element.
b. If the 0th element is greater than the 4th one then, the 4th element is first
stored in temp variable and the 0th element (ie. greater element) is stored in
the 4th position and the element stored in temp is stored in the 0th position.
Rearrange the elements at n/2 interval
This process goes on for all the remaining elements.
Rearrange all the elements at n/2 interval
3. In the second loop, an interval of N/4 = 8/4 = 2 is taken and again the
elements lying at these intervals are sorted.
Rearrange
the elements at n/4 interval
You might get confused at this point. All the elements in the array lying at
the current interval are compared: the elements at the 4th and 2nd positions
are compared, and the elements at the 2nd and 0th positions are also compared.
4. The same process goes on for remaining elements.
Rearrange all the elements at n/4 interval
5. Finally, when the interval is N/8 = 8/8 =1 then the array elements lying at the
interval of 1 are sorted. The array is now completely sorted.
Rearrange the elements at n/8 interval
Shell Sort Algorithm
for (interval = n / 2; interval > 0; interval /= 2) {
    for (i = interval; i < n; i += 1) {
        int temp = a[i];
        int j;
        for (j = i; j >= interval && a[j - interval] > temp; j -= interval) {
            a[j] = a[j - interval];
        }
        a[j] = temp;
    }
}
Source Code:
#include <stdio.h>
int main()
{
    int n, i, a[100], interval;
    printf("Enter number of elements: ");
    scanf("%d", &n);
    printf("Enter the elements into an array: ");
    for (i = 0; i < n; i++)
        scanf("%d", &a[i]);
    for (interval = n / 2; interval > 0; interval /= 2) {
        for (i = interval; i < n; i += 1) {
            int temp = a[i];
            int j;
            /* shift the larger elements of this interval chain right */
            for (j = i; j >= interval && a[j - interval] > temp; j -= interval) {
                a[j] = a[j - interval];
            }
            a[j] = temp;
        }
    }
    printf("After performing the shell sort, the sorted array is: ");
    for (i = 0; i < n; i++)
        printf("%d\t", a[i]);
    return 0;
}
Time Complexity
 Worst Case Complexity: less than or equal to O(n2)
The worst-case complexity for shell sort is always less than or equal to O(n2).
According to Poonen's theorem, the worst-case complexity for shell sort
is Θ(N(log N)2/(log log N)2), or Θ(N(log N)2/log log N), or Θ(N(log N)2), or
something in between.
 Best Case Complexity: O(n*log n)
When the array is already sorted, the total number of comparisons for each
interval (or increment) is equal to the size of the array.
 Average Case Complexity: O(n*log n)
It is around O(n^1.25).
Example 2:
5. Heap Sort
Heap sort is a comparison based sorting technique based on Binary Heap data structure. It
is similar to selection sort where we first find the maximum element and place the
maximum element at the end. We repeat the same process for the remaining elements.
What is Binary Heap?
Let us first define a Complete Binary Tree. A complete binary tree is a binary tree in which
every level, except possibly the last, is completely filled, and all nodes are as far left as
possible (Source Wikipedia)
A Binary Heap is a Complete Binary Tree where items are stored in a special order such
that value in a parent node is greater(or smaller) than the values in its two children nodes.
The former is called as max heap and the latter is called min-heap. The heap can be
represented by a binary tree or array.
Why array based representation for Binary Heap?
Since a Binary Heap is a Complete Binary Tree, it can be easily represented as an array and
the array-based representation is space-efficient. If the parent node is stored at index I, the
left child can be calculated by 2 * I + 1 and right child by 2 * I + 2 (assuming the indexing
starts at 0).
Heap Sort Algorithm for sorting in increasing order:
1. Build a max heap from the input data.
2. At this point, the largest item is stored at the root of the heap. Replace it with the last
item of the heap followed by reducing the size of heap by 1. Finally, heapify the root of the
tree.
3. Repeat step 2 while size of heap is greater than 1.
How to build the heap?
Heapify procedure can be applied to a node only if its children nodes are heapified. So the
heapification must be performed in the bottom-up order.
Lets understand with the help of an example:
Input data: 4, 10, 3, 5, 1
        4(0)
       /    \
   10(1)    3(2)
   /    \
 5(3)   1(4)
The numbers in brackets represent the indices in the array
representation of the data.
Applying heapify procedure to index 1:
        4(0)
       /    \
   10(1)    3(2)
   /    \
 5(3)   1(4)
Applying heapify procedure to index 0:
       10(0)
       /    \
    5(1)    3(2)
   /    \
 4(3)   1(4)
The heapify procedure calls itself recursively, sifting the element down
until the subtree rooted at that index satisfies the heap property.
Source Code:
#include <stdio.h>
// Function to swap the positions of two elements
void swap(int *a, int *b) {
    int temp = *a;
    *a = *b;
    *b = temp;
}
void heapify(int arr[], int n, int i) {
    // Find largest among root, left child and right child
    int largest = i;
    int left = 2 * i + 1;
    int right = 2 * i + 2;
    if (left < n && arr[left] > arr[largest])
        largest = left;
    if (right < n && arr[right] > arr[largest])
        largest = right;
    // Swap and continue heapifying if root is not largest
    if (largest != i) {
        swap(&arr[i], &arr[largest]);
        heapify(arr, n, largest);
    }
}
// Main function to do heap sort
void heapSort(int arr[], int n) {
    // Build max heap
    int i;
    for (i = n / 2 - 1; i >= 0; i--)
        heapify(arr, n, i);
    // Heap sort: move the current maximum to the end, then re-heapify
    for (i = n - 1; i > 0; i--) {
        swap(&arr[0], &arr[i]);
        // Heapify root element to get highest element at root again
        heapify(arr, i, 0);
    }
}
// Print an array
void printArray(int arr[], int n) {
    int i;
    for (i = 0; i < n; ++i)
        printf("%d ", arr[i]);
    printf("\n");
}
// Driver code
int main() {
    int n, i, a[100];
    printf("Enter number of elements: ");
    scanf("%d", &n);
    printf("Enter the elements into an array: ");
    for (i = 0; i < n; i++)
        scanf("%d", &a[i]);
    heapSort(a, n);
    printf("Sorted array is\n");
    printArray(a, n);
    return 0;
}
Relationship between Array Indexes and Tree
Elements
A complete binary tree has an interesting property that we can use to find
the children and parents of any node.
If the index of any element in the array is i, the element at
index 2i+1 will be the left child and the element at index 2i+2 will be
the right child. Also, the parent of any element at index i is given by the
floor of (i-1)/2.
Relationship between array and heap indices
Let's test it out,
Left child of 1 (index 0)
= element in (2*0+1) index
= element in 1 index
= 12
Right child of 1
= element in (2*0+2) index
= element in 2 index
= 9
Similarly,
Left child of 12 (index 1)
= element in (2*1+1) index
= element in 3 index
= 5
Right child of 12
= element in (2*1+2) index
= element in 4 index
= 6
Let us also confirm that the rules hold for finding the parent of any node
Parent of 9 (position 2)
= floor((2-1)/2)
= floor(0.5)
= 0 index
= 1
Parent of 12 (position 1)
= floor((1-1)/2)
= 0 index
= 1
Now let's think of another scenario in which there is more than one level.
How to heapify root element when its subtrees are
already max heaps
The top element isn't a max-heap but all the sub-trees are max-heaps.
To maintain the max-heap property for the entire tree, we will have to keep
pushing 2 downwards until it reaches its correct position.
How to heapify
root element when its subtrees are max-heaps
Thus, to maintain the max-heap property in a tree where both sub-trees are
max-heaps, we need to run heapify on the root element repeatedly until it is
larger than its children or it becomes a leaf node.
We can combine both these conditions in one heapify function as
void heapify(int arr[], int n, int i) {
    // Find largest among root, left child and right child
    int largest = i;
    int left = 2 * i + 1;
    int right = 2 * i + 2;
    if (left < n && arr[left] > arr[largest])
        largest = left;
    if (right < n && arr[right] > arr[largest])
        largest = right;
    // Swap and continue heapifying if root is not largest
    if (largest != i) {
        swap(&arr[i], &arr[largest]);
        heapify(arr, n, largest);
    }
}
This function works for both the base case and for a tree of any size. We
can thus move the root element to the correct position to maintain the max-
heap status for any tree size as long as the sub-trees are max-heaps.
Build max-heap
To build a max-heap from any tree, we can thus start heapifying each sub-
tree from the bottom up and end up with a max-heap after the function is
applied to all the elements including the root element.
In the case of a complete tree, the index of the last non-leaf node is given
by n/2 - 1. All nodes after that are leaf nodes and thus don't need to
be heapified.
So, we can build a maximum heap as
// Build heap (rearrange array)
for (int i = n / 2 - 1; i >= 0; i--)
heapify(arr, n, i);
Create array and calculate i
Steps to build max heap for heap sort
As shown in the above diagram, we start by heapifying the smallest subtrees
at the lowest levels and gradually move up until we reach the root element.
If you've understood everything till here, congratulations, you are on your
way to mastering the Heap sort.
How Heap Sort Works?
1. Since the tree satisfies Max-Heap property, then the largest item is stored
at the root node.
2. Swap: Remove the root element and put it at the end of the array (the nth
position). Put the last item of the tree (heap) in the vacant place.
3. Remove: Reduce the size of the heap by 1.
4. Heapify: Heapify the root element again so that we have the highest
element at root.
5. The process is repeated until all the items of the list are sorted.
Swap, Remove, and Heapify
The code below shows the operation.
// Heap sort
for (int i = n - 1; i > 0; i--) {
    swap(&arr[0], &arr[i]);
    // Heapify root element to get highest element at root again
    heapify(arr, i, 0);
}
Heap Sort Complexity
Heap Sort has O(n log n) time complexity for all the cases (best case, average case, and
worst case).
Let us understand the reason why. The height of a complete binary tree containing n elements
is log n.
6. Merge Sort
Merge sort is a sorting technique based on divide and conquer technique. With worst-
case time complexity being Ο(n log n), it is one of the most respected algorithms.
Merge sort first divides the array into equal halves and then combines them in a sorted
manner.
How Merge Sort Works?
To understand merge sort, we take an unsorted array as the following −
We know that merge sort first divides the whole array into equal halves until
atomic (single-element) values are reached. We see here that an array of 8 items is divided into two
arrays of size 4.
This does not change the sequence of appearance of items in the original. Now we divide
these two arrays into halves.
We further divide these arrays and we achieve atomic value which can no more be
divided.
Now, we combine them in exactly the same manner as they were broken down. Please
note the color codes given to these lists.
We first compare the element for each list and then combine them into another list in a
sorted manner. We see that 14 and 33 are in sorted positions. We compare 27 and 10 and
in the target list of 2 values we put 10 first, followed by 27. We change the order of 19
and 35 whereas 42 and 44 are placed sequentially.
In the next iteration of the combining phase, we compare lists of two data values and
merge them into a list of four data values, placing all in sorted order.
After the final merging, the list should look like this −
Now we should learn some programming aspects of merge sorting.
Algorithm
Merge sort keeps on dividing the list into equal halves until it can no longer be divided. By
definition, if there is only one element in the list, it is sorted. Then, merge sort combines the
smaller sorted lists, keeping the new list sorted too.
Step 1 − if there is only one element in the list, it is already sorted; return.
Step 2 − divide the list recursively into two halves until it can no longer be divided.
Step 3 − merge the smaller lists into a new list in sorted order.
Pseudocode
MergeSort(arr[], l, r)
If r > l
1. Find the middle point to divide the array into two halves:
middle m = l+ (r-l)/2
2. Call mergeSort for first half:
Call mergeSort(arr, l, m)
3. Call mergeSort for second half:
Call mergeSort(arr, m+1, r)
4. Merge the two halves sorted in step 2 and 3:
Call merge(arr, l, m, r)
void mergeSort(int a[], int start, int end)
{
int mid;
if(start < end)
{
mid = (start + end) / 2;
mergeSort(a, start, mid);
mergeSort(a, mid+1, end);
merge(a, start, mid, end);
}
}
Source Code:
#include <stdio.h>
// lets take a[5] = {32, 45, 67, 2, 7} as the array to be sorted.
void merge(int a[], int start, int mid, int end); // prototype: merge is defined below
// merge sort function
void mergeSort(int a[], int start, int end)
{
int mid;
if(start < end)
{
mid = (start + end) / 2;
mergeSort(a, start, mid);
mergeSort(a, mid+1, end);
merge(a, start, mid, end);
}
}
// function to merge the subarrays
void merge(int a[], int start, int mid, int end)
{
int b[100]; //same size of a[]
int i, j, k;
k = 0;
i = start;
j = mid + 1;
while(i <= mid && j <= end)
{
if(a[i] < a[j])
{
b[k++] = a[i++]; // same as b[k]=a[i]; k++; i++;
}
else
{
b[k++] = a[j++];
}
}
while(i <= mid)
{
b[k++] = a[i++];
}
while(j <= end)
{
b[k++] = a[j++];
}
for(i=end; i >= start; i--)
{
a[i] = b[--k]; // copying back the sorted list to a[]
}
}
// function to print the array
void printArray(int a[], int size)
{
int i;
for (i=0; i < size; i++)
{
printf("%d ", a[i]);
}
printf("\n");
}
int main()
{
int n,i,j,a[100],t;
printf("Enter number of elements: ");
scanf("%d",&n);
printf("Enter the elements into an array: ");
for(i=0;i<n;i++)
scanf("%d",&a[i]);
// calling merge sort
mergeSort(a, 0, n - 1);
printf("\nSorted array:\n");
printArray(a, n);
return 0;
}
The merge function works as follows:
1. Create copies of the subarrays L ← A[p..q] and M ← A[q+1..r].
2. Create three pointers i, j and k
a. i maintains the current index of L, starting at 1
b. j maintains the current index of M, starting at 1
c. k maintains the current index of A[p..r], starting at p.
3. Until we reach the end of either L or M, pick the smaller among the elements
from L and M and place it in the correct position at A[p..r].
4. When we run out of elements in either L or M, pick up the remaining elements and put
them in A[p..r].
Important Characteristics of Merge Sort:
 Merge Sort is useful for sorting linked lists.
 Merge Sort is a stable sort, which means that equal elements in an array maintain their original
positions with respect to each other.
 The overall time complexity of Merge sort is O(n log n). It is efficient because even in the
worst case the runtime is O(n log n).
 The space complexity of Merge sort is O(n). This means that this algorithm takes a lot of extra
space and may slow down operations for large data sets.
Recurrence Relation - Merge Sort
T(n) = 2T(n/2) + n
Expanding the recurrence gives log n levels of recursion, each doing O(n) merge work,
so T(n) = O(n log n).
7. Quick Sort
Like Merge Sort, QuickSort is a Divide and Conquer algorithm. It picks an element as pivot
and partitions the given array around the picked pivot. There are many different versions of
quickSort that pick pivot in different ways.
1. Always pick the first element as pivot (used in the C programs below).
2. Always pick the last element as pivot (used in the pseudocode below).
3. Pick a random element as pivot.
4. Pick the median as pivot.
The key process in quickSort is partition(). The target of partition() is, given an array and an
element x of the array as pivot, to put x at its correct position in the sorted array, put all smaller
elements (smaller than x) before x, and put all greater elements (greater than x) after x. All
this should be done in linear time.
Quick Sort Algorithm
Using pivot algorithm recursively, we end up with smaller possible partitions. Each
partition is then processed for quick sort. We define recursive algorithm for quicksort
as follows −
Step 1 − Make the right-most index value pivot
Step 2 − partition the array using pivot value
Step 3 − quicksort left partition recursively
Step 4 − quicksort right partition recursively
Quick Sort Pseudocode
To get more into it, let see the pseudocode for quick sort algorithm −
procedure quickSort(left, right)
if right-left <= 0
return
else
pivot = A[right]
partition = partitionFunc(left, right, pivot)
quickSort(left,partition-1)
quickSort(partition+1,right)
end if
end procedure
Source Code:
#include <stdio.h>
void quickSort(int a[], int l, int u)
{
    int i, j, temp, k;
    i = l + 1;
    j = u;
    k = l;                           /* first element is the pivot */
    if (l < u)
    {
        while (i <= j)
        {
            while (i <= u && a[i] < a[k])   /* bound check stops overrun */
                i++;
            while (a[j] > a[k])
                j--;
            if (i < j)
            {
                temp = a[i];
                a[i] = a[j];
                a[j] = temp;
                i++;                 /* step past the swapped pair so that  */
                j--;                 /* keys equal to the pivot cannot loop */
            }
            else
                break;
        }
        temp = a[j];
        a[j] = a[k];                 /* pivot and j element are swapped */
        a[k] = temp;
        quickSort(a, l, j - 1);
        quickSort(a, j + 1, u);
    }
}
int main()
{
    int a[100];
    int i, n;
    printf("Enter number of elements: ");
    scanf("%d", &n);
    printf("Enter the elements into an array: ");
    for (i = 0; i < n; i++)
        scanf("%d", &a[i]);
    quickSort(a, 0, n - 1);
    printf("Elements After QuickSort:\n");
    for (i = 0; i < n; i++)
        printf("%d ", a[i]);
    return 0;
}
Source Code:
#include<stdio.h>
#include<conio.h>
void swap(int* a, int* b)
{
    int t = *a;
    *a = *b;
    *b = t;
}
void main()
{
    int a[100];
    int i, n;
    printf("Enter number of elements: ");
    scanf("%d", &n);
    printf("Enter the elements into an array: ");
    for(i = 0; i < n; i++)
        scanf("%d", &a[i]);
    qsort(a, 0, n - 1);
    printf("Elements After QuickSort:\n");
    for(i = 0; i < n; i++)
        printf("%d ", a[i]);
    getch();
}
qsort(int a[], int l, int u)
{
    int i, j, k;
    i = l + 1;
    j = u;
    k = l;                          /* a[k] is the pivot */
    if(l < u)
    {
        while(i <= j)
        {
            while(i <= u && a[i] < a[k])   /* bound check prevents overrun */
                i++;
            while(a[j] > a[k])
                j--;
            if(i < j)
            {
                swap(&a[i], &a[j]);
                i++;               /* advance past the swapped pair */
                j--;
            }
        }
        swap(&a[j], &a[k]);        /* pivot and j-th element are swapped */
        qsort(a, l, j - 1);
        qsort(a, j + 1, u);
    }
}
How QuickSort Works?
1. A pivot element is chosen from the array. You can choose any element
of the array as the pivot element. Here, we have taken the rightmost
(i.e., the last) element of the array as the pivot.
[Figure: Select a pivot element]
2. The elements smaller than the pivot element are put on the left and the
elements greater than the pivot element are put on the right.
[Figure: Put all the smaller elements on the left and greater on the right of the pivot element]
The above arrangement is achieved by the following steps.
a. A pointer is fixed at the pivot element. The pivot element is compared with
the elements beginning from the first index. When an element greater than the
pivot element is reached, a second pointer is set for that element.
b. Now, the pivot element is compared with the remaining elements (using a third
pointer). When an element smaller than the pivot element is reached, the
smaller element is swapped with the greater element found earlier.
[Figure: Comparison of the pivot element with other elements]
c. The process goes on until the second-last element is reached.
Finally, the pivot element is swapped with the second pointer.
[Figure: Swap the pivot element with the second pointer]
d. Now the left and right subparts of this pivot element are taken for further
processing in the steps below.
3. Pivot elements are again chosen for the left and the right sub-parts
separately. Within these sub-parts, the pivot elements are placed at their
correct positions. Then, step 2 is repeated.
[Figure: Select a pivot element in each half and put it at its correct place using recursion]
4. The sub-parts are again divided into smaller sub-parts until each subpart is
formed of a single element.
5. At this point, the array is already sorted.
Quicksort uses recursion for sorting the sub-parts.
On the basis of the divide-and-conquer approach, the quicksort algorithm can be
explained as:
 Divide
The array is divided into subparts, taking the pivot as the partitioning point.
The elements smaller than the pivot are placed to the left of the pivot and the
elements greater than the pivot are placed to the right.
 Conquer
The left and the right subparts are again partitioned by selecting pivot
elements for them. This can be achieved by recursively passing the subparts into
the algorithm.
 Combine
This step does not play a significant role in quicksort. The array is already sorted
at the end of the conquer step.
You can understand the working of quicksort with the help of illustrations.
[Figure: Sorting the elements on the left of the pivot using recursion]
Recurrence Relation
Worst case
Assume partitioning around the pivot takes linear time. The worst case occurs
when the subarrays are completely unbalanced: one subarray has 1 element and the
other has (n − 1) elements. The recurrence for the worst-case running time is
T(n) = T(n − 1) + Θ(n),
where the Θ(n) term is the cost of partitioning.
Best case
In the best case the pivot splits the array evenly, and the recurrence for
quicksort is:
T(n) = 2T(n/2) + O(n)
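A quick sketch of how these recurrences solve (c denotes the constant hidden in the Θ(n)/O(n) partitioning cost):

```latex
% Worst case: unroll T(n) = T(n-1) + cn
T(n) = cn + c(n-1) + \dots + c \cdot 1 = c\,\frac{n(n+1)}{2} = \Theta(n^2)

% Best case: T(n) = 2T(n/2) + cn; by the Master Theorem (case 2)
T(n) = \Theta(n \log n)
```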
Merge Two Arrays in C
#include<stdio.h>
#include<conio.h>
int main()
{
    int arr1[50], arr2[50], size1, size2, i, k, merge[100];
    printf("Enter Array 1 Size: ");
    scanf("%d", &size1);
    printf("Enter Array 1 Elements: ");
    for(i=0; i<size1; i++)
    {
        scanf("%d", &arr1[i]);
        merge[i] = arr1[i];       /* copy array 1 into the merged array */
    }
    k = i;
    printf("\nEnter Array 2 Size: ");
    scanf("%d", &size2);
    printf("Enter Array 2 Elements: ");
    for(i=0; i<size2; i++)
    {
        scanf("%d", &arr2[i]);
        merge[k] = arr2[i];       /* append array 2 after array 1 */
        k++;
    }
    printf("\nThe new array after merging is:\n");
    for(i=0; i<k; i++)
        printf("%d ", merge[i]);
    getch();
    return 0;
}
Radix Sort:
#include<stdio.h>
int get_max(int a[], int n){
    int i, max = a[0];
    for (i = 1; i < n; i++)
        if (a[i] > max)
            max = a[i];
    return max;
}
void radix_sort(int a[], int n){
    int bucket[10][10], bucket_cnt[10];
    int i, j, k, r, NOP = 0, divisor = 1, lar, pass;
    lar = get_max(a, n);              /* the largest value decides the number of passes */
    while (lar > 0){
        NOP++;                        /* NOP = number of digits in the largest value */
        lar /= 10;
    }
    for (pass = 0; pass < NOP; pass++){
        for (i = 0; i < 10; i++){
            bucket_cnt[i] = 0;
        }
        for (i = 0; i < n; i++){
            r = (a[i] / divisor) % 10;        /* digit examined in this pass */
            bucket[r][bucket_cnt[r]] = a[i];
            bucket_cnt[r] += 1;
        }
        i = 0;
        for (k = 0; k < 10; k++){             /* gather buckets back into the array */
            for (j = 0; j < bucket_cnt[k]; j++){
                a[i] = bucket[k][j];
                i++;
            }
        }
        divisor *= 10;
        printf("After pass %d : ", pass + 1);
        for (i = 0; i < n; i++)
            printf("%d ", a[i]);
        printf("\n");
    }
}
int main(){
    int i, n, a[10];
    printf("Enter the number of items to be sorted: ");
    scanf("%d", &n);
    printf("Enter items: ");
    for (i = 0; i < n; i++){
        scanf("%d", &a[i]);
    }
    radix_sort(a, n);
    printf("Sorted items : ");
    for (i = 0; i < n; i++)
        printf("%d ", a[i]);
    printf("\n");
    return 0;
}
50
G BHARATHKUMAR, AssistantProfessor,CSEDepartment

More Related Content

What's hot

Sorting Algorithm
Sorting AlgorithmSorting Algorithm
Sorting Algorithm
Al Amin
 
Sorting algorithm
Sorting algorithmSorting algorithm
Sorting algorithm
Balaji Nangare
 
Sorting Seminar Presentation by Ashin Guha Majumder
Sorting Seminar Presentation by Ashin Guha MajumderSorting Seminar Presentation by Ashin Guha Majumder
Sorting Seminar Presentation by Ashin Guha Majumder
Ashin Guha Majumder
 
Merge sort analysis and its real time applications
Merge sort analysis and its real time applicationsMerge sort analysis and its real time applications
Merge sort analysis and its real time applications
yazad dumasia
 
Sorting
SortingSorting
Insertion Sorting
Insertion SortingInsertion Sorting
Insertion Sorting
FarihaHabib123
 
Data Structures- Part4 basic sorting algorithms
Data Structures- Part4 basic sorting algorithmsData Structures- Part4 basic sorting algorithms
Data Structures- Part4 basic sorting algorithms
Abdullah Al-hazmy
 
Sorting algorithms
Sorting algorithmsSorting algorithms
Sorting algorithms
Trupti Agrawal
 
Sorting
SortingSorting
Sorting
Samsil Arefin
 
Data Structures & Algorithm design using C
Data Structures & Algorithm design using C Data Structures & Algorithm design using C
Data Structures & Algorithm design using C
Emertxe Information Technologies Pvt Ltd
 
3.9 external sorting
3.9 external sorting3.9 external sorting
3.9 external sorting
Krish_ver2
 
Sorting
SortingSorting
Sorting
Gopi Saiteja
 
Implementing Merge Sort
Implementing Merge SortImplementing Merge Sort
Implementing Merge Sort
smita gupta
 
Sorting algorithms
Sorting algorithmsSorting algorithms
Sorting algorithms
Eleonora Ciceri
 
Searching & Sorting Algorithms
Searching & Sorting AlgorithmsSearching & Sorting Algorithms
Searching & Sorting Algorithms
Rahul Jamwal
 
SEARCHING AND SORTING ALGORITHMS
SEARCHING AND SORTING ALGORITHMSSEARCHING AND SORTING ALGORITHMS
SEARCHING AND SORTING ALGORITHMS
Gokul Hari
 
Sorting Algorithms
Sorting AlgorithmsSorting Algorithms
Sorting Algorithms
Mohammed Hussein
 
Sorting And Type of Sorting
Sorting And Type of SortingSorting And Type of Sorting
Sorting And Type of Sorting
MITULJAMANG
 
Selection sorting
Selection sortingSelection sorting
Selection sorting
Himanshu Kesharwani
 

What's hot (20)

Sorting Algorithm
Sorting AlgorithmSorting Algorithm
Sorting Algorithm
 
Sorting algorithm
Sorting algorithmSorting algorithm
Sorting algorithm
 
Sorting
SortingSorting
Sorting
 
Sorting Seminar Presentation by Ashin Guha Majumder
Sorting Seminar Presentation by Ashin Guha MajumderSorting Seminar Presentation by Ashin Guha Majumder
Sorting Seminar Presentation by Ashin Guha Majumder
 
Merge sort analysis and its real time applications
Merge sort analysis and its real time applicationsMerge sort analysis and its real time applications
Merge sort analysis and its real time applications
 
Sorting
SortingSorting
Sorting
 
Insertion Sorting
Insertion SortingInsertion Sorting
Insertion Sorting
 
Data Structures- Part4 basic sorting algorithms
Data Structures- Part4 basic sorting algorithmsData Structures- Part4 basic sorting algorithms
Data Structures- Part4 basic sorting algorithms
 
Sorting algorithms
Sorting algorithmsSorting algorithms
Sorting algorithms
 
Sorting
SortingSorting
Sorting
 
Data Structures & Algorithm design using C
Data Structures & Algorithm design using C Data Structures & Algorithm design using C
Data Structures & Algorithm design using C
 
3.9 external sorting
3.9 external sorting3.9 external sorting
3.9 external sorting
 
Sorting
SortingSorting
Sorting
 
Implementing Merge Sort
Implementing Merge SortImplementing Merge Sort
Implementing Merge Sort
 
Sorting algorithms
Sorting algorithmsSorting algorithms
Sorting algorithms
 
Searching & Sorting Algorithms
Searching & Sorting AlgorithmsSearching & Sorting Algorithms
Searching & Sorting Algorithms
 
SEARCHING AND SORTING ALGORITHMS
SEARCHING AND SORTING ALGORITHMSSEARCHING AND SORTING ALGORITHMS
SEARCHING AND SORTING ALGORITHMS
 
Sorting Algorithms
Sorting AlgorithmsSorting Algorithms
Sorting Algorithms
 
Sorting And Type of Sorting
Sorting And Type of SortingSorting And Type of Sorting
Sorting And Type of Sorting
 
Selection sorting
Selection sortingSelection sorting
Selection sorting
 

Similar to Sorting

Data Structures 6
Data Structures 6Data Structures 6
Data Structures 6
Dr.Umadevi V
 
my docoment
my docomentmy docoment
my docoment
NeeshanYonzan
 
Selection sort
Selection sortSelection sort
Selection sortasra khan
 
Selection sort lab mannual
Selection sort lab mannualSelection sort lab mannual
Selection sort lab mannual
maamir farooq
 
Sorting algorithms
Sorting algorithmsSorting algorithms
Sorting algorithms
CHANDAN KUMAR
 
Data Structures_ Sorting & Searching
Data Structures_ Sorting & SearchingData Structures_ Sorting & Searching
Data Structures_ Sorting & Searching
ThenmozhiK5
 
DS - Unit 2 FINAL (2).pptx
DS - Unit 2 FINAL (2).pptxDS - Unit 2 FINAL (2).pptx
DS - Unit 2 FINAL (2).pptx
prakashvs7
 
Ijcse13 05-01-048
Ijcse13 05-01-048Ijcse13 05-01-048
Ijcse13 05-01-048vital vital
 
Ijcse13 05-01-048
Ijcse13 05-01-048Ijcse13 05-01-048
Ijcse13 05-01-048vital vital
 
Advanced s and s algorithm.ppt
Advanced s and s algorithm.pptAdvanced s and s algorithm.ppt
Advanced s and s algorithm.ppt
LegesseSamuel
 
Sorting algorithums > Data Structures & Algorithums
Sorting algorithums  > Data Structures & AlgorithumsSorting algorithums  > Data Structures & Algorithums
Sorting algorithums > Data Structures & AlgorithumsAin-ul-Moiz Khawaja
 
Sorting
SortingSorting
Selection Sort with Improved Asymptotic Time Bounds
Selection Sort with Improved Asymptotic Time BoundsSelection Sort with Improved Asymptotic Time Bounds
Selection Sort with Improved Asymptotic Time Bounds
theijes
 
Chapter 8 advanced sorting and hashing for print
Chapter 8 advanced sorting and hashing for printChapter 8 advanced sorting and hashing for print
Chapter 8 advanced sorting and hashing for print
Abdii Rashid
 
Best,worst,average case .17581556 045
Best,worst,average case .17581556 045Best,worst,average case .17581556 045
Best,worst,average case .17581556 045
university of Gujrat, pakistan
 
Selection_Sort-CSI (For Sharing and General )
Selection_Sort-CSI (For Sharing and General )Selection_Sort-CSI (For Sharing and General )
Selection_Sort-CSI (For Sharing and General )
phukak12345
 
Daa chapter5
Daa chapter5Daa chapter5
Daa chapter5
B.Kirron Reddi
 
L1803016468
L1803016468L1803016468
L1803016468
IOSR Journals
 
Data structures arrays
Data structures   arraysData structures   arrays
Data structures arrays
maamir farooq
 
Ch 1 intriductions
Ch 1 intriductionsCh 1 intriductions
Ch 1 intriductionsirshad17
 

Similar to Sorting (20)

Data Structures 6
Data Structures 6Data Structures 6
Data Structures 6
 
my docoment
my docomentmy docoment
my docoment
 
Selection sort
Selection sortSelection sort
Selection sort
 
Selection sort lab mannual
Selection sort lab mannualSelection sort lab mannual
Selection sort lab mannual
 
Sorting algorithms
Sorting algorithmsSorting algorithms
Sorting algorithms
 
Data Structures_ Sorting & Searching
Data Structures_ Sorting & SearchingData Structures_ Sorting & Searching
Data Structures_ Sorting & Searching
 
DS - Unit 2 FINAL (2).pptx
DS - Unit 2 FINAL (2).pptxDS - Unit 2 FINAL (2).pptx
DS - Unit 2 FINAL (2).pptx
 
Ijcse13 05-01-048
Ijcse13 05-01-048Ijcse13 05-01-048
Ijcse13 05-01-048
 
Ijcse13 05-01-048
Ijcse13 05-01-048Ijcse13 05-01-048
Ijcse13 05-01-048
 
Advanced s and s algorithm.ppt
Advanced s and s algorithm.pptAdvanced s and s algorithm.ppt
Advanced s and s algorithm.ppt
 
Sorting algorithums > Data Structures & Algorithums
Sorting algorithums  > Data Structures & AlgorithumsSorting algorithums  > Data Structures & Algorithums
Sorting algorithums > Data Structures & Algorithums
 
Sorting
SortingSorting
Sorting
 
Selection Sort with Improved Asymptotic Time Bounds
Selection Sort with Improved Asymptotic Time BoundsSelection Sort with Improved Asymptotic Time Bounds
Selection Sort with Improved Asymptotic Time Bounds
 
Chapter 8 advanced sorting and hashing for print
Chapter 8 advanced sorting and hashing for printChapter 8 advanced sorting and hashing for print
Chapter 8 advanced sorting and hashing for print
 
Best,worst,average case .17581556 045
Best,worst,average case .17581556 045Best,worst,average case .17581556 045
Best,worst,average case .17581556 045
 
Selection_Sort-CSI (For Sharing and General )
Selection_Sort-CSI (For Sharing and General )Selection_Sort-CSI (For Sharing and General )
Selection_Sort-CSI (For Sharing and General )
 
Daa chapter5
Daa chapter5Daa chapter5
Daa chapter5
 
L1803016468
L1803016468L1803016468
L1803016468
 
Data structures arrays
Data structures   arraysData structures   arrays
Data structures arrays
 
Ch 1 intriductions
Ch 1 intriductionsCh 1 intriductions
Ch 1 intriductions
 

More from BHARATH KUMAR

Object-Oriented concepts.pptx
Object-Oriented concepts.pptxObject-Oriented concepts.pptx
Object-Oriented concepts.pptx
BHARATH KUMAR
 
Java buzzwords.pptx
Java buzzwords.pptxJava buzzwords.pptx
Java buzzwords.pptx
BHARATH KUMAR
 
history and evaluation of java.pptx
history and evaluation of java.pptxhistory and evaluation of java.pptx
history and evaluation of java.pptx
BHARATH KUMAR
 
Data Models
Data ModelsData Models
Data Models
BHARATH KUMAR
 
Structure of a DBMS/Architecture of a DBMS
Structure of a DBMS/Architecture of a DBMSStructure of a DBMS/Architecture of a DBMS
Structure of a DBMS/Architecture of a DBMS
BHARATH KUMAR
 
DBMS languages/ Types of SQL Commands
DBMS languages/ Types of SQL CommandsDBMS languages/ Types of SQL Commands
DBMS languages/ Types of SQL Commands
BHARATH KUMAR
 
1.4 data independence
1.4 data independence1.4 data independence
1.4 data independence
BHARATH KUMAR
 
data abstraction in DBMS
data abstraction in DBMSdata abstraction in DBMS
data abstraction in DBMS
BHARATH KUMAR
 
File system vs DBMS
File system vs DBMSFile system vs DBMS
File system vs DBMS
BHARATH KUMAR
 
DBMS introduction
DBMS introductionDBMS introduction
DBMS introduction
BHARATH KUMAR
 
Trees and Graphs in data structures and Algorithms
Trees and Graphs in data structures and AlgorithmsTrees and Graphs in data structures and Algorithms
Trees and Graphs in data structures and Algorithms
BHARATH KUMAR
 
Linked List
Linked ListLinked List
Linked List
BHARATH KUMAR
 
ADT STACK and Queues
ADT STACK and QueuesADT STACK and Queues
ADT STACK and Queues
BHARATH KUMAR
 
Why we study LMC? by GOWRU BHARATH KUMAR
Why we study LMC? by GOWRU BHARATH KUMARWhy we study LMC? by GOWRU BHARATH KUMAR
Why we study LMC? by GOWRU BHARATH KUMAR
BHARATH KUMAR
 
Introduction of Data Structures and Algorithms by GOWRU BHARATH KUMAR
Introduction of Data Structures and Algorithms by GOWRU BHARATH KUMARIntroduction of Data Structures and Algorithms by GOWRU BHARATH KUMAR
Introduction of Data Structures and Algorithms by GOWRU BHARATH KUMAR
BHARATH KUMAR
 
A Survey on Big Data Analytics
A Survey on Big Data AnalyticsA Survey on Big Data Analytics
A Survey on Big Data Analytics
BHARATH KUMAR
 
Relation between Languages, Machines and Computations
Relation between Languages, Machines and ComputationsRelation between Languages, Machines and Computations
Relation between Languages, Machines and Computations
BHARATH KUMAR
 

More from BHARATH KUMAR (17)

Object-Oriented concepts.pptx
Object-Oriented concepts.pptxObject-Oriented concepts.pptx
Object-Oriented concepts.pptx
 
Java buzzwords.pptx
Java buzzwords.pptxJava buzzwords.pptx
Java buzzwords.pptx
 
history and evaluation of java.pptx
history and evaluation of java.pptxhistory and evaluation of java.pptx
history and evaluation of java.pptx
 
Data Models
Data ModelsData Models
Data Models
 
Structure of a DBMS/Architecture of a DBMS
Structure of a DBMS/Architecture of a DBMSStructure of a DBMS/Architecture of a DBMS
Structure of a DBMS/Architecture of a DBMS
 
DBMS languages/ Types of SQL Commands
DBMS languages/ Types of SQL CommandsDBMS languages/ Types of SQL Commands
DBMS languages/ Types of SQL Commands
 
1.4 data independence
1.4 data independence1.4 data independence
1.4 data independence
 
data abstraction in DBMS
data abstraction in DBMSdata abstraction in DBMS
data abstraction in DBMS
 
File system vs DBMS
File system vs DBMSFile system vs DBMS
File system vs DBMS
 
DBMS introduction
DBMS introductionDBMS introduction
DBMS introduction
 
Trees and Graphs in data structures and Algorithms
Trees and Graphs in data structures and AlgorithmsTrees and Graphs in data structures and Algorithms
Trees and Graphs in data structures and Algorithms
 
Linked List
Linked ListLinked List
Linked List
 
ADT STACK and Queues
ADT STACK and QueuesADT STACK and Queues
ADT STACK and Queues
 
Why we study LMC? by GOWRU BHARATH KUMAR
Why we study LMC? by GOWRU BHARATH KUMARWhy we study LMC? by GOWRU BHARATH KUMAR
Why we study LMC? by GOWRU BHARATH KUMAR
 
Introduction of Data Structures and Algorithms by GOWRU BHARATH KUMAR
Introduction of Data Structures and Algorithms by GOWRU BHARATH KUMARIntroduction of Data Structures and Algorithms by GOWRU BHARATH KUMAR
Introduction of Data Structures and Algorithms by GOWRU BHARATH KUMAR
 
A Survey on Big Data Analytics
A Survey on Big Data AnalyticsA Survey on Big Data Analytics
A Survey on Big Data Analytics
 
Relation between Languages, Machines and Computations
Relation between Languages, Machines and ComputationsRelation between Languages, Machines and Computations
Relation between Languages, Machines and Computations
 

Recently uploaded

DESIGN A COTTON SEED SEPARATION MACHINE.docx
DESIGN A COTTON SEED SEPARATION MACHINE.docxDESIGN A COTTON SEED SEPARATION MACHINE.docx
DESIGN A COTTON SEED SEPARATION MACHINE.docx
FluxPrime1
 
Hybrid optimization of pumped hydro system and solar- Engr. Abdul-Azeez.pdf
Hybrid optimization of pumped hydro system and solar- Engr. Abdul-Azeez.pdfHybrid optimization of pumped hydro system and solar- Engr. Abdul-Azeez.pdf
Hybrid optimization of pumped hydro system and solar- Engr. Abdul-Azeez.pdf
fxintegritypublishin
 
block diagram and signal flow graph representation
block diagram and signal flow graph representationblock diagram and signal flow graph representation
block diagram and signal flow graph representation
Divya Somashekar
 
WATER CRISIS and its solutions-pptx 1234
WATER CRISIS and its solutions-pptx 1234WATER CRISIS and its solutions-pptx 1234
WATER CRISIS and its solutions-pptx 1234
AafreenAbuthahir2
 
ASME IX(9) 2007 Full Version .pdf
ASME IX(9)  2007 Full Version       .pdfASME IX(9)  2007 Full Version       .pdf
ASME IX(9) 2007 Full Version .pdf
AhmedHussein950959
 
TECHNICAL TRAINING MANUAL GENERAL FAMILIARIZATION COURSE
TECHNICAL TRAINING MANUAL   GENERAL FAMILIARIZATION COURSETECHNICAL TRAINING MANUAL   GENERAL FAMILIARIZATION COURSE
TECHNICAL TRAINING MANUAL GENERAL FAMILIARIZATION COURSE
DuvanRamosGarzon1
 
addressing modes in computer architecture
addressing modes  in computer architectureaddressing modes  in computer architecture
addressing modes in computer architecture
ShahidSultan24
 
ethical hacking-mobile hacking methods.ppt
ethical hacking-mobile hacking methods.pptethical hacking-mobile hacking methods.ppt
ethical hacking-mobile hacking methods.ppt
Jayaprasanna4
 
Cosmetic shop management system project report.pdf
Cosmetic shop management system project report.pdfCosmetic shop management system project report.pdf
Cosmetic shop management system project report.pdf
Kamal Acharya
 
Immunizing Image Classifiers Against Localized Adversary Attacks
Immunizing Image Classifiers Against Localized Adversary AttacksImmunizing Image Classifiers Against Localized Adversary Attacks
Immunizing Image Classifiers Against Localized Adversary Attacks
gerogepatton
 
HYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generationHYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generation
Robbie Edward Sayers
 
power quality voltage fluctuation UNIT - I.pptx
power quality voltage fluctuation UNIT - I.pptxpower quality voltage fluctuation UNIT - I.pptx
power quality voltage fluctuation UNIT - I.pptx
ViniHema
 
CME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional ElectiveCME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional Elective
karthi keyan
 
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
Dr.Costas Sachpazis
 
ethical hacking in wireless-hacking1.ppt
ethical hacking in wireless-hacking1.pptethical hacking in wireless-hacking1.ppt
ethical hacking in wireless-hacking1.ppt
Jayaprasanna4
 
一比一原版(SFU毕业证)西蒙菲莎大学毕业证成绩单如何办理
一比一原版(SFU毕业证)西蒙菲莎大学毕业证成绩单如何办理一比一原版(SFU毕业证)西蒙菲莎大学毕业证成绩单如何办理
一比一原版(SFU毕业证)西蒙菲莎大学毕业证成绩单如何办理
bakpo1
 
COLLEGE BUS MANAGEMENT SYSTEM PROJECT REPORT.pdf
COLLEGE BUS MANAGEMENT SYSTEM PROJECT REPORT.pdfCOLLEGE BUS MANAGEMENT SYSTEM PROJECT REPORT.pdf
COLLEGE BUS MANAGEMENT SYSTEM PROJECT REPORT.pdf
Kamal Acharya
 
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
AJAYKUMARPUND1
 
road safety engineering r s e unit 3.pdf
road safety engineering  r s e unit 3.pdfroad safety engineering  r s e unit 3.pdf
road safety engineering r s e unit 3.pdf
VENKATESHvenky89705
 
Architectural Portfolio Sean Lockwood
Architectural Portfolio Sean LockwoodArchitectural Portfolio Sean Lockwood
Architectural Portfolio Sean Lockwood
seandesed
 

Recently uploaded (20)

DESIGN A COTTON SEED SEPARATION MACHINE.docx
DESIGN A COTTON SEED SEPARATION MACHINE.docxDESIGN A COTTON SEED SEPARATION MACHINE.docx
DESIGN A COTTON SEED SEPARATION MACHINE.docx
 
Hybrid optimization of pumped hydro system and solar- Engr. Abdul-Azeez.pdf
Hybrid optimization of pumped hydro system and solar- Engr. Abdul-Azeez.pdfHybrid optimization of pumped hydro system and solar- Engr. Abdul-Azeez.pdf
Hybrid optimization of pumped hydro system and solar- Engr. Abdul-Azeez.pdf
 
block diagram and signal flow graph representation
block diagram and signal flow graph representationblock diagram and signal flow graph representation
block diagram and signal flow graph representation
 
WATER CRISIS and its solutions-pptx 1234
WATER CRISIS and its solutions-pptx 1234WATER CRISIS and its solutions-pptx 1234
WATER CRISIS and its solutions-pptx 1234
 
ASME IX(9) 2007 Full Version .pdf
ASME IX(9)  2007 Full Version       .pdfASME IX(9)  2007 Full Version       .pdf
ASME IX(9) 2007 Full Version .pdf
 
TECHNICAL TRAINING MANUAL GENERAL FAMILIARIZATION COURSE
TECHNICAL TRAINING MANUAL   GENERAL FAMILIARIZATION COURSETECHNICAL TRAINING MANUAL   GENERAL FAMILIARIZATION COURSE
TECHNICAL TRAINING MANUAL GENERAL FAMILIARIZATION COURSE
 
addressing modes in computer architecture
addressing modes  in computer architectureaddressing modes  in computer architecture
addressing modes in computer architecture
 
ethical hacking-mobile hacking methods.ppt
ethical hacking-mobile hacking methods.pptethical hacking-mobile hacking methods.ppt
ethical hacking-mobile hacking methods.ppt
 
Cosmetic shop management system project report.pdf
Cosmetic shop management system project report.pdfCosmetic shop management system project report.pdf
Cosmetic shop management system project report.pdf
 
Immunizing Image Classifiers Against Localized Adversary Attacks
Immunizing Image Classifiers Against Localized Adversary AttacksImmunizing Image Classifiers Against Localized Adversary Attacks
Immunizing Image Classifiers Against Localized Adversary Attacks
 
HYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generationHYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generation
 
power quality voltage fluctuation UNIT - I.pptx
power quality voltage fluctuation UNIT - I.pptxpower quality voltage fluctuation UNIT - I.pptx
power quality voltage fluctuation UNIT - I.pptx
 
CME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional ElectiveCME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional Elective
 
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
 
ethical hacking in wireless-hacking1.ppt
ethical hacking in wireless-hacking1.pptethical hacking in wireless-hacking1.ppt
ethical hacking in wireless-hacking1.ppt
 
一比一原版(SFU毕业证)西蒙菲莎大学毕业证成绩单如何办理
一比一原版(SFU毕业证)西蒙菲莎大学毕业证成绩单如何办理一比一原版(SFU毕业证)西蒙菲莎大学毕业证成绩单如何办理
一比一原版(SFU毕业证)西蒙菲莎大学毕业证成绩单如何办理
 
COLLEGE BUS MANAGEMENT SYSTEM PROJECT REPORT.pdf
COLLEGE BUS MANAGEMENT SYSTEM PROJECT REPORT.pdfCOLLEGE BUS MANAGEMENT SYSTEM PROJECT REPORT.pdf
COLLEGE BUS MANAGEMENT SYSTEM PROJECT REPORT.pdf
 
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
 
road safety engineering r s e unit 3.pdf
road safety engineering  r s e unit 3.pdfroad safety engineering  r s e unit 3.pdf
road safety engineering r s e unit 3.pdf
 
Architectural Portfolio Sean Lockwood
Architectural Portfolio Sean LockwoodArchitectural Portfolio Sean Lockwood
Architectural Portfolio Sean Lockwood
 

Sorting

  • 1. 1 G BHARATHKUMAR, AssistantProfessor,CSEDepartment MODULE-4 Objectives of Sorting: The objective of the sorting algorithm is to rearrange the records so that their keys are ordered according to some well-defined ordering rule. Problem: Given an array of n real number A[1.. n]. Objective: Sort the elements of A in ascending order of their values. Internal Sort If the file to be sorted will fit into memory or equivalently if it will fit into an array, then the sorting method is called internal. In this method, any record can be accessed easily. External Sort  Sorting files from tape or disk.  In this method, an external sort algorithm must access records sequentially, or at least in the block. Memory Requirement 1. Sort in place and use no extra memory except perhaps for a small stack or table. 2. Algorithm that use a linked-list representation and so use N extra words of memory for list pointers. 3. Algorithms that need enough extra memory space to hold another copy of the array to be sorted. Stability A sorting algorithm is called stable if it is preserves the relative order of equal keys in the file. Most of the simple algorithm are stable, but most of the well-known sophisticated algorithms are not. There are two classes of sorting algorithms namely, O(n2)-algorithms and O(n log n)- algorithms. O(n2)-class includes bubble sort, insertion sort, selection sort and shell sort. O(n log n)-class includes heap sort, merge sort and quick sort. O(n2) Sorting Algorithms
  • 2. 2 G BHARATHKUMAR, AssistantProfessor,CSEDepartment O(n log n) Sorting Algorithms SORTING PROPERTIES Sorting Properties Property Description Adaptive A sort is adaptive if it runs faster on a partially sorted array. Stable A sort is stable if it preserves the relative order of equal keys in the database. In Situ An in situ (“in place”) sort moves the items within the array itself and, thus, requires only a small O(1) amount of extra storage. Online An online sort can process its data piece-by-piece in serial fashion without having the entire array available from the beginning of the algorithm.
  • 3. 3 G BHARATHKUMAR, AssistantProfessor,CSEDepartment Properties Of Sorting Algorithms Adaptive Stable In Situ Online Linear Insertion Yes Yes Yes Yes Mergesort No Yes No Yes Quicksort No† No Yes No †Quicksort actually runs more slowly on a partially sorted array. Runtime Properties Of Sorting Algorithms Linear Insertion o Average case Ο(n2) o Worst-case Ο(n2) o Runs in O(n) time on a sorted array Mergesort o Average case Ο(n lg n) o Worst-case Ο(n lg n) o Runtime is not affected by the array contents, only the array size Quicksort o Average case Ο(n lg n) o Worst-case Ο(n2) on a sorted array o Median-of-three partitioning guarantees Ο(n lg n) runtime Choosing a Sorting Algorithm To choose a sorting algorithm for a particular problem, consider the running time, space complexity, and the expected format of the input list. Algorithm Best-case Worst-case Average-case Space Complexity Stable? Merge Sort O(n log n)O(nlogn) O(n log n)O(nlogn) O(n log n)O(nlogn) O(n)O(n) Yes Insertion Sort O(n)O(n) O(n^2)O(n2) O(n^2)O(n2) O(1)O(1) Yes
  • 4. 4 G BHARATHKUMAR, AssistantProfessor,CSEDepartment Bubble Sort O(n)O(n) O(n^2)O(n2) O(n^2)O(n2) O(1)O(1) Yes Quicksort O(n log n)O(nlogn) O(n^2)O(n2) O(n log n)O(nlogn) log nlogn best, nn avg Usuall y not* Heapsort O(n log n)O(nlogn) O(n log n)O(nlogn) O(n log n)O(nlogn) O(1)O(1) No Counting Sort O(k+n)O(k+n) O(k+n)O(k+n) O(k+n)O(k+n) O(k+n)O(k+n) Yes *Most quicksort implementations are not stable, though stable implementations do exist. When choosing a sorting algorithm to use, weigh these factors. For example, quicksort is a very fast algorithm but can be pretty tricky to implement; bubble sort is a slow algorithm but is very easy to implement. To sort small sets of data, bubble sort may be a better option since it can be implemented quickly, but for larger datasets, the speedup from quicksort might be worth the trouble implementing the algorithm. 1. Bubble sort It is also called as exchange sort. In this sort the comparison of adjacent elements is done whenever the 1st element is greater than the 2nd element. Then the swapping will be done. For any kind of sorting if we have n number elements there will be n-1 iterations. The efficiency of bubble sort is O(n). Bubble Sort is the simplest sorting algorithm that works by repeatedly swapping the adjacent elements if they are in wrong order. Example: First Pass: ( 5 1 4 2 8 ) –> ( 1 5 4 2 8 ), Here, algorithm compares the first two elements, and swaps since 5 > 1. ( 1 5 4 2 8 ) –> ( 1 4 5 2 8 ), Swap since 5 > 4 ( 1 4 5 2 8 ) –> ( 1 4 2 5 8 ), Swap since 5 > 2 ( 1 4 2 5 8 ) –> ( 1 4 2 5 8 ), Now, since these elements are already in order (8 > 5), algorithm does not swap them. Second Pass: ( 1 4 2 5 8 ) –> ( 1 4 2 5 8 ) ( 1 4 2 5 8 ) –> ( 1 2 4 5 8 ), Swap since 4 > 2
( 1 2 4 5 8 ) –> ( 1 2 4 5 8 )
( 1 2 4 5 8 ) –> ( 1 2 4 5 8 )
Now the array is already sorted, but our algorithm does not know whether it is completed. The algorithm needs one whole pass without any swap to know the array is sorted.
Third Pass:
( 1 2 4 5 8 ) –> ( 1 2 4 5 8 )
( 1 2 4 5 8 ) –> ( 1 2 4 5 8 )
( 1 2 4 5 8 ) –> ( 1 2 4 5 8 )
( 1 2 4 5 8 ) –> ( 1 2 4 5 8 )

// C program for implementation of bubble sort
#include<stdio.h>
int main()
{
    int n, i, j, a[100], t;
    printf("Enter number of elements: ");
    scanf("%d", &n);
    printf("Enter the elements into an array: ");
    for(i = 0; i < n; i++)
        scanf("%d", &a[i]);
    for(i = 0; i < n; i++)
    {
        for(j = 0; j < n - i - 1; j++)
        {
            if(a[j] > a[j+1])
            {
                t = a[j];
                a[j] = a[j+1];
                a[j+1] = t;
            }
        }
    }
    printf("After performing bubble sort, the sorted array is: ");
    for(i = 0; i < n; i++)
        printf("%d\t", a[i]);
    return 0;
}

Worst and Average Case Time Complexity: O(n^2). The worst case occurs when the array is reverse sorted.
Best Case Time Complexity: O(n), which occurs when the array is already sorted and the algorithm stops after a pass with no swaps (the plain nested-loop version above always performs O(n^2) comparisons).
Auxiliary Space: O(1)
Boundary Cases: Bubble sort takes minimum time (order of n) when the elements are already sorted.
Sorting In Place: Yes
Stable: Yes

Due to its simplicity, bubble sort is often used to introduce the concept of a sorting algorithm. In computer graphics it is popular for its capability to detect a very small error (like a swap of just two elements) in almost-sorted arrays and fix it with just linear complexity (2n). For example, it is used in a polygon filling algorithm, where bounding lines are sorted by their x coordinate at a specific scan line (a line parallel to the x axis) and, with incrementing y, their order changes (two elements are swapped) only at intersections of two lines.

Recurrence form of bubble sort: T(n) = T(n-1) + n

2. Selection Sort

 Selection sort is a simple sorting algorithm. It is an in-place comparison-based algorithm in which the list is divided into two parts: the sorted part at the left end and the unsorted part at the right end. Initially, the sorted part is empty and the unsorted part is the entire list.
 The smallest element is selected from the unsorted part and swapped with its leftmost element, and that element becomes part of the sorted part. This process continues, moving the unsorted-part boundary one element to the right each time.
 This algorithm is not suitable for large data sets, as its average and worst case complexities are O(n^2), where n is the number of items.

Algorithm for Selection Sort:
Step 1 − Set min to the first location
Step 2 − Search for the minimum element in the array
Step 3 − Swap the first location with the minimum value in the array
Step 4 − Assign the second element as min
Step 5 − Repeat the process until we get a sorted array

Source Code:
#include<stdio.h>
#include<conio.h>
int main()
{
    int n, i, a[100], j, min, t;
    printf("enter the number of elements: ");
    scanf("%d", &n);
    printf("enter the elements into an array: ");
How Selection Sort Works?

Consider the following depicted array as an example.

For the first position in the sorted list, the whole list is scanned sequentially. 14 is stored at the first position presently; we search the whole list and find that 10 is the lowest value. So we replace 14 with 10. After one iteration, 10, which happens to be the minimum value in the list, appears in the first position of the sorted list.

For the second position, where 33 is residing, we start scanning the rest of the list in a linear manner. We find that 14 is the second lowest value in the list and that it should appear at the second place. We swap these values. After two iterations, the two least values are positioned at the beginning in a sorted manner.

The same process is applied to the rest of the items in the array. Following is a pictorial depiction of the entire sorting process.
  • 9. 9 G BHARATHKUMAR, AssistantProfessor,CSEDepartment recurrence relation of selection sort T(n)=T(n-1) + n-1 3. Insertion sort Insertion sort is a simple sorting algorithm that works similar to the way you sort playing cards in your hands. The array is virtually split into a sorted and an unsorted part. Values from the unsorted part are picked and placed at the correct position in the sorted part. Example: How Insertion Sort Works? We take an unsorted array for our example. Insertion sort compares the first two elements. It finds that both 14 and 33 are already in ascending order. For now, 14 is in sorted sub-list.
Insertion sort moves ahead and compares 33 with 27. It finds that 33 is not in the correct position, so it swaps 33 with 27. It also checks against all the elements of the sorted sub-list. Here we see that the sorted sub-list has only one element, 14, and 27 is greater than 14. Hence, the sorted sub-list remains sorted after the swap.

By now we have 14 and 27 in the sorted sub-list. Next, it compares 33 with 10. These values are not in sorted order, so we swap them. However, the swap makes 27 and 10 unsorted. Hence, we swap them too. Again we find 14 and 10 in unsorted order.
  • 11. 11 G BHARATHKUMAR, AssistantProfessor,CSEDepartment We swap them again. By the end of third iteration, we have a sorted sub-list of 4 items. This process goes on until all the unsorted values are covered in a sorted sub-list. Now we shall see some programming aspects of insertion sort. Example 2: Example 3: 12, 11, 13, 5, 6 Let us loop for i = 1 (second element of the array) to 4 (last element of the array) i = 1. Since 11 is smaller than 12, move 12 and insert 11 before 12 11, 12, 13, 5, 6 i = 2. 13 will remain at its position as all elements in A[0..I-1] are smaller than 13 11, 12, 13, 5, 6 i = 3. 5 will move to the beginning and all other elements from 11 to 13 will move one position ahead of their current position. 5, 11, 12, 13, 6 i = 4. 6 will move to position after 5, and elements from 11 to 13 will move one position ahead of their current position. 5, 6, 11, 12, 13 Algorithm: Step 1 − If it is the first element, it is already sorted. return 1; Step 2 − Pick next element
Step 3 − Compare with all elements in the sorted sub-list
Step 4 − Shift all the elements in the sorted sub-list that are greater than the value to be sorted
Step 5 − Insert the value
Step 6 − Repeat until the list is sorted

#include<stdio.h>
int main()
{
    int n, i, j, arr[100], key;
    printf("Enter number of elements: ");
    scanf("%d", &n);
    printf("Enter the elements into an array: ");
    for(i = 0; i < n; i++)
        scanf("%d", &arr[i]);
    for (i = 1; i < n; i++)
    {
        key = arr[i];
        j = i - 1;
        /* Move elements of arr[0..i-1] that are greater than key
           to one position ahead of their current position */
        while (j >= 0 && arr[j] > key)
        {
            arr[j + 1] = arr[j];
            j = j - 1;
        }
        arr[j + 1] = key;
    }
    printf("After performing insertion sort, the sorted array is: ");
    for(i = 0; i < n; i++)
        printf("%d\t", arr[i]);
    return 0;
}

4. Shell Sort

Shell sort is a highly efficient sorting algorithm based on the insertion sort algorithm. It avoids large shifts, as happen in insertion sort when a smaller value is at the far right and has to be moved to the far left. The algorithm first applies insertion sort to widely spaced elements, then sorts the less widely spaced elements.
  • 13. 13 G BHARATHKUMAR, AssistantProfessor,CSEDepartment ShellSort is mainly a variation of Insertion Sort. In insertion sort, we move elements only one position ahead. When an element has to be moved far ahead, many movements are involved. The idea of shellSort is to allow exchange of far items. Insertion sort does not perform well when the close elements are far apart. Shell sort helps in reducing the distance between the close elements. Thus, there will be less number of swappings to be performed. How Shell Sort Works? 1. Suppose, we need to sort the following array. Initial array 2. We are using the shell's original sequence (N/2, N/4, ...1) as intervals in our algorithm. In the first loop, if the array size is N = 8 then, the elements lying at the interval of N/2 = 4 are compared and swapped if they are not in order. a. The 0th element is compared with the 4th element. b. If the 0th element is greater than the 4th one then, the 4th element is first stored in temp variable and the 0th element (ie. greater element) is stored in the 4th position and the element stored in temp is stored in the 0th position. Rearrange the elements at n/2 interval
This process goes on for all the remaining elements.

Rearrange all the elements at n/2 interval

3. In the second loop, an interval of N/4 = 8/4 = 2 is taken and again the elements lying at these intervals are sorted.

Rearrange the elements at n/4 interval

You might get confused at this point: all the elements in the array lying at the current interval are compared. The elements at the 4th and 2nd positions are compared, and the elements at the 2nd and 0th positions are also compared.

4. The same process goes on for the remaining elements.

Rearrange all the elements at n/4 interval
  • 15. 15 G BHARATHKUMAR, AssistantProfessor,CSEDepartment 5. Finally, when the interval is N/8 = 8/8 =1 then the array elements lying at the interval of 1 are sorted. The array is now completely sorted. Rearrange the elements at n/8 interval Shell Sort Algorithm for (interval = n / 2; interval > 0; interval /= 2) { for (i = interval; i < n; i += 1) { int temp = a[i]; int j; for (j = i; j >= interval && a[j - interval] > temp; j -= interval) { a[j] = a[j - interval]; }
            a[j] = temp;
    }
}

Source Code:
#include<stdio.h>
int main()
{
    int n, i, a[100], interval;
    printf("Enter number of elements: ");
    scanf("%d", &n);
    printf("Enter the elements into an array: ");
    for(i = 0; i < n; i++)
        scanf("%d", &a[i]);
    for (interval = n / 2; interval > 0; interval /= 2)
    {
        for (i = interval; i < n; i += 1)
        {
            int temp = a[i];
            int j;
            for (j = i; j >= interval && a[j - interval] > temp; j -= interval)
            {
                a[j] = a[j - interval];
            }
            a[j] = temp;
        }
    }
    printf("After performing shell sort, the sorted array is: ");
    for(i = 0; i < n; i++)
        printf("%d\t", a[i]);
    return 0;
}
Time Complexity

 Worst Case Complexity: less than or equal to O(n^2). The worst-case complexity for shell sort (with Shell's original increments) is always less than or equal to O(n^2). According to Poonen's theorem, the worst-case complexity of shell sort is Θ(N (log N)^2 / (log log N)^2), or Θ(N (log N)^2 / log log N), or Θ(N (log N)^2), or something in between, depending on the gap sequence.
 Best Case Complexity: O(n log n). When the array is already sorted, the total number of comparisons for each interval (or increment) is equal to the size of the array.
 Average Case Complexity: O(n log n); in practice it is around O(n^1.25).

Example 2:
5. Heap Sort

Heap sort is a comparison-based sorting technique based on the binary heap data structure. It is similar to selection sort in that we first find the maximum element and place it at the end; we then repeat the same process for the remaining elements.

What is a Binary Heap?

Let us first define a complete binary tree. A complete binary tree is a binary tree in which every level, except possibly the last, is completely filled, and all nodes are as far left as possible. (Source: Wikipedia)

A binary heap is a complete binary tree where items are stored in a special order such that the value in a parent node is greater (or smaller) than the values in its two children nodes. The former is called a max-heap and the latter a min-heap. The heap can be represented by a binary tree or by an array.

Why an array-based representation for a binary heap?

Since a binary heap is a complete binary tree, it can easily be represented as an array, and the array-based representation is space-efficient. If the parent node is stored at index I, the left child can be calculated as 2 * I + 1 and the right child as 2 * I + 2 (assuming the indexing starts at 0).

Heap Sort Algorithm for sorting in increasing order:
1. Build a max heap from the input data.
2. At this point, the largest item is stored at the root of the heap. Replace it with the last item of the heap, then reduce the size of the heap by 1. Finally, heapify the root of the tree.
3. Repeat step 2 while the size of the heap is greater than 1.

How to build the heap?

The heapify procedure can be applied to a node only if its children nodes are already heapified. So heapification must be performed in bottom-up order. Let us understand this with the help of an example:

Input data: 4, 10, 3, 5, 1

        4(0)
       /    \
    10(1)   3(2)
    /   \
  5(3)  1(4)

The numbers in brackets represent the indices in the array representation of the data.

Applying the heapify procedure to index 1 (10 is already larger than its children 5 and 1, so nothing changes):

        4(0)
       /    \
    10(1)   3(2)
    /   \
  5(3)  1(4)

Applying the heapify procedure to index 0:

       10(0)
       /    \
     5(1)   3(2)
    /   \
  4(3)  1(4)

The heapify procedure calls itself recursively to build the heap in top-down manner.

Source Code:
#include <stdio.h>

// Function to swap the positions of two elements
void swap(int *a, int *b)
{
    int temp = *a;
    *a = *b;
    *b = temp;
}

void heapify(int arr[], int n, int i)
{
    // Find largest among root, left child and right child
    int largest = i;
    int left = 2 * i + 1;
    int right = 2 * i + 2;
  • 21. 21 G BHARATHKUMAR, AssistantProfessor,CSEDepartment if (left < n && arr[left] > arr[largest]) largest = left; if (right < n && arr[right] > arr[largest]) largest = right; // Swap and continue heapifying if root is not largest if (largest != i) { swap(&arr[i], &arr[largest]); heapify(arr, n, largest); } } // Main function to do heap sort void heapSort(int arr[], int n) { // Build max heap int i; for (i = n / 2 - 1; i >= 0; i--) heapify(arr, n, i); // Heap sort for (i = n - 1; i >= 0; i--) { swap(&arr[0], &arr[i]); // Heapify root element to get highest element at root again heapify(arr, i, 0); } } // Print an array void printArray(int arr[], int n) { int i;
    for (i = 0; i < n; ++i)
        printf("%d ", arr[i]);
    printf("\n");
}

// Driver code
int main()
{
    // int arr[] = {1, 12, 9, 5, 6, 10};
    // int n = sizeof(arr) / sizeof(arr[0]);
    int n, i, a[100];
    printf("Enter number of elements: ");
    scanf("%d", &n);
    printf("Enter the elements into an array: ");
    for(i = 0; i < n; i++)
        scanf("%d", &a[i]);
    heapSort(a, n);
    printf("Sorted array is:\n");
    printArray(a, n);
    return 0;
}
Relationship between Array Indexes and Tree Elements

A complete binary tree has an interesting property that we can use to find the children and parent of any node. If the index of any element in the array is i, the element at index 2i+1 becomes the left child and the element at index 2i+2 becomes the right child. Also, the parent of any element at index i is given by the lower bound of (i-1)/2.

Relationship between array and heap indices

Let's test it out:
Left child of 1 (index 0) = element at index (2*0+1) = element at index 1 = 12
Right child of 1 = element at index (2*0+2) = element at index 2 = 9
  • 24. 24 G BHARATHKUMAR, AssistantProfessor,CSEDepartment Similarly, Left child of 12 (index 1) = element in (2*1+1) index = element in 3 index = 5 Right child of 12 = element in (2*1+2) index = element in 4 index = 6 Let us also confirm that the rules hold for finding parent of any node Parent of 9 (position 2) = (2-1)/2 = ½ = 0.5 ~ 0 index = 1 Parent of 12 (position 1) = (1-1)/2 = 0 index = 1 Now let's think of another scenario in which there is more than one level.
  • 25. 25 G BHARATHKUMAR, AssistantProfessor,CSEDepartment How to heapify root element when its subtrees are already max heaps The top element isn't a max-heap but all the sub-trees are max-heaps. To maintain the max-heap property for the entire tree, we will have to keep pushing 2 downwards until it reaches its correct position. How to heapify root element when its subtrees are max-heaps
  • 26. 26 G BHARATHKUMAR, AssistantProfessor,CSEDepartment Thus, to maintain the max-heap property in a tree where both sub-trees are max-heaps, we need to run heapify on the root element repeatedly until it is larger than its children or it becomes a leaf node. We can combine both these conditions in one heapify function as void heapify(int arr[], int n, int i) { // Find largest among root, left child and right child int largest = i; int left = 2 * i + 1; int right = 2 * i + 2; if (left < n && arr[left] > arr[largest]) largest = left; if (right < n && arr[right] > arr[largest]) largest = right; // Swap and continue heapifying if root is not largest if (largest != i) { swap(&arr[i], &arr[largest]); heapify(arr, n, largest); } } This function works for both the base case and for a tree of any size. We can thus move the root element to the correct position to maintain the max- heap status for any tree size as long as the sub-trees are max-heaps. Build max-heap To build a max-heap from any tree, we can thus start heapifying each sub- tree from the bottom up and end up with a max-heap after the function is applied to all the elements including the root element.
  • 27. 27 G BHARATHKUMAR, AssistantProfessor,CSEDepartment In the case of a complete tree, the first index of a non-leaf node is given by n/2 - 1. All other nodes after that are leaf-nodes and thus don't need to be heapified. So, we can build a maximum heap as // Build heap (rearrange array) for (int i = n / 2 - 1; i >= 0; i--) heapify(arr, n, i); Create array and calculate i
  • 28. 28 G BHARATHKUMAR, AssistantProfessor,CSEDepartment Steps to build max heap for heap sort Steps to build max heap for heap sort
  • 29. 29 G BHARATHKUMAR, AssistantProfessor,CSEDepartment Steps to build max heap for heap sort As shown in the above diagram, we start by heapifying the lowest smallest trees and gradually move up until we reach the root element. If you've understood everything till here, congratulations, you are on your way to mastering the Heap sort.
  • 30. 30 G BHARATHKUMAR, AssistantProfessor,CSEDepartment How Heap Sort Works? 1. Since the tree satisfies Max-Heap property, then the largest item is stored at the root node. 2. Swap: Remove the root element and put at the end of the array (nth position) Put the last item of the tree (heap) at the vacant place. 3. Remove: Reduce the size of the heap by 1. 4. Heapify: Heapify the root element again so that we have the highest element at root. 5. The process is repeated until all the items of the list are sorted.
Remove, and Heapify

The code below shows the operation.

// Heap sort
for (int i = n - 1; i >= 0; i--) {
    swap(&arr[0], &arr[i]);
    // Heapify root element to get highest element at root again
    heapify(arr, i, 0);
}

Heap Sort Complexity

Heap sort has O(n log n) time complexity for all cases (best case, average case, and worst case). Let us understand the reason why: the height of a complete binary tree containing n elements is log n.

6. Merge Sort

Merge sort is a sorting technique based on the divide and conquer technique. With worst-case time complexity being O(n log n), it is one of the most respected algorithms. Merge sort first divides the array into equal halves and then combines them in a sorted manner.

How Merge Sort Works?

To understand merge sort, we take an unsorted array as the following −

We know that merge sort first divides the whole array iteratively into equal halves unless the atomic values are achieved. We see here that an array of 8 items is divided into two arrays of size 4. This does not change the sequence of appearance of items in the original. Now we divide these two arrays into halves.
We further divide these arrays until we achieve atomic values which can no more be divided.

Now, we combine them in exactly the same manner as they were broken down. Please note the color codes given to these lists. We first compare the elements of each list and then combine them into another list in a sorted manner. We see that 14 and 33 are in sorted positions. We compare 27 and 10, and in the target list of 2 values we put 10 first, followed by 27. We change the order of 19 and 35, whereas 42 and 44 are placed sequentially.

In the next iteration of the combining phase, we compare lists of two data values and merge them into a list of four data values, placing all in sorted order.

After the final merging, the list should look like this −

Now we should learn some programming aspects of merge sorting.

Algorithm

Merge sort keeps on dividing the list into equal halves until it can no more be divided. By definition, if there is only one element in the list, it is sorted. Then, merge sort combines the smaller sorted lists, keeping the new list sorted too.

Step 1 − if there is only one element in the list, it is already sorted; return.
Step 2 − divide the list recursively into two halves until it can no more be divided.
Step 3 − merge the smaller lists into a new list in sorted order.

Pseudocode

MergeSort(arr[], l, r)
If r > l
1. Find the middle point to divide the array into two halves:
   middle m = l + (r-l)/2
2. Call mergeSort for first half:
   Call mergeSort(arr, l, m)
  • 34. 34 G BHARATHKUMAR, AssistantProfessor,CSEDepartment 3. Call mergeSort for second half: Call mergeSort(arr, m+1, r) 4. Merge the two halves sorted in step 2 and 3: Call merge(arr, l, m, r) void mergeSort(int a[], int start, int end) { int mid; if(start < end) { mid = (start + end) / 2; mergeSort(a, start, mid); mergeSort(a, mid+1, end); merge(a, start, mid, end); } }
Source Code:
#include <stdio.h>

// lets take a[5] = {32, 45, 67, 2, 7} as the array to be sorted.

// prototype so that mergeSort can call merge before its definition
void merge(int a[], int start, int mid, int end);

// merge sort function
void mergeSort(int a[], int start, int end)
{
    int mid;
    if(start < end)
    {
        mid = (start + end) / 2;
        mergeSort(a, start, mid);
        mergeSort(a, mid+1, end);
        merge(a, start, mid, end);
    }
}

// function to merge the subarrays
void merge(int a[], int start, int mid, int end)
{
    int b[100];   // same size as a[]
    int i, j, k;
    k = 0;
    i = start;
    j = mid + 1;
    while(i <= mid && j <= end)
    {
        if(a[i] < a[j])
        {
            b[k++] = a[i++];   // same as b[k]=a[i]; k++; i++;
        }
        else
        {
            b[k++] = a[j++];
        }
    }
    while(i <= mid)
    {
        b[k++] = a[i++];
    }
    while(j <= end)
    {
        b[k++] = a[j++];
    }
    for(i = end; i >= start; i--)
    {
        a[i] = b[--k];   // copying back the sorted list to a[]
    }
}

// function to print the array
void printArray(int a[], int size)
{
    int i;
    for (i = 0; i < size; i++)
    {
        printf("%d ", a[i]);
    }
    printf("\n");
}

int main()
{
    int n, i, a[100];
    printf("Enter number of elements: ");
    scanf("%d", &n);
    printf("Enter the elements into an array: ");
    for(i = 0; i < n; i++)
        scanf("%d", &a[i]);
    // calling merge sort
    mergeSort(a, 0, n - 1);
    printf("\nSorted array:\n");
    printArray(a, n);
    return 0;
}
The merge function works as follows:
1. Create copies of the subarrays L ← A[p..q] and M ← A[q+1..r].
2. Create three pointers i, j and k:
 a. i maintains the current index of L, starting at 1
 b. j maintains the current index of M, starting at 1
 c. k maintains the current index of A[p..q], starting at p
3. Until we reach the end of either L or M, pick the smaller of the current elements of L and M and place it in the correct position at A[p..q].
4. When we run out of elements in either L or M, pick up the remaining elements and put them in A[p..q].

Important Characteristics of Merge Sort:
 Merge Sort is useful for sorting linked lists.
 Merge Sort is a stable sort, which means that equal elements in an array maintain their original positions with respect to each other.
 The overall time complexity of merge sort is O(n log n). It is efficient because even in the worst case the runtime is O(n log n).
 The space complexity of merge sort is O(n). This means the algorithm uses extra space, which may slow down operations for large data sets.

Recurrence Relation - Merge Sort
T(n) = 2T(n/2) + n
  • 38. 38 G BHARATHKUMAR, AssistantProfessor,CSEDepartment 7. Quick Sort Like Merge Sort, QuickSort is a Divide and Conquer algorithm. It picks an element as pivot and partitions the given array around the picked pivot. There are many different versions of quickSort that pick pivot in different ways. 1. Always pick first element as pivot. 2. Always pick last element as pivot (implemented below) 3. Pick a random element as pivot. 4. Pick median as pivot. The key process in quickSort is partition(). Target of partitions is, given an array and an element x of array as pivot, put x at its correct position in sorted array and put all smaller elements (smaller than x) before x, and put all greater elements (greater than x) after x. All this should be done in linear time. Quick Sort Algorithm Using pivot algorithm recursively, we end up with smaller possible partitions. Each partition is then processed for quick sort. We define recursive algorithm for quicksort as follows − Step 1 − Make the right-most index value pivot Step 2 − partition the array using pivot value Step 3 − quicksort left partition recursively Step 4 − quicksort right partition recursively Quick Sort Pseudocode To get more into it, let see the pseudocode for quick sort algorithm − procedure quickSort(left, right) if right-left <= 0 return else
pivot = A[right]
partition = partitionFunc(left, right, pivot)
quickSort(left, partition-1)
quickSort(partition+1, right)
end if
end procedure

Source Code:
#include<stdio.h>
#include<conio.h>
void qsort(int a[], int l, int u);   /* prototype: qsort is defined below */
int main()
{
    int a[100];
    int i, n;
    printf("Enter number of elements: ");
    scanf("%d", &n);
    printf("Enter the elements into an array: ");
    for(i = 0; i < n; i++)
        scanf("%d", &a[i]);
    qsort(a, 0, n-1);
    printf("Elements After QuickSort:\n");
    for(i = 0; i < n; i++)
        printf("%d ", a[i]);
    getch();
    return 0;
}
void qsort(int a[], int l, int u)
{
    int i, j, temp, k;
    i = l+1;
    j = u;
    k = l;
    if(l < u)
    {
        while(i <= j)
        {
            while(i <= u && a[i] < a[k])   /* bounds check keeps i inside a[l..u] */
                i++;
            while(a[j] > a[k])
                j--;
            if(i < j)
            {
                temp = a[i];
                a[i] = a[j];
                a[j] = temp;
            }
        }
        temp = a[j];
        a[j] = a[k];   // pivot and j-th element are swapped
        a[k] = temp;
        qsort(a, l, j-1);
        qsort(a, j+1, u);
    }
}

Source Code:
#include<stdio.h>
#include<conio.h>
void swap(int* a, int* b)
{
    int t = *a;
    *a = *b;
    *b = t;
}
void qsort(int a[], int l, int u);   /* prototype: qsort is defined below */
int main()
{
    int a[100];
    int i, n;
    printf("Enter number of elements: ");
    scanf("%d", &n);
    printf("Enter the elements into an array: ");
    for(i = 0; i < n; i++)
        scanf("%d", &a[i]);
    qsort(a, 0, n-1);
    printf("Elements After QuickSort:\n");
    for(i = 0; i < n; i++)
        printf("%d ", a[i]);
    getch();
    return 0;
}
void qsort(int a[], int l, int u)
{
    int i, j, k;
    i = l+1;
    j = u;
    k = l;
    if(l < u)
    {
        while(i <= j)
        {
            while(i <= u && a[i] < a[k])   /* bounds check keeps i inside a[l..u] */
                i++;
            while(a[j] > a[k])
                j--;
            if(i < j)
            {
                swap(&a[i], &a[j]);
            }
        }
        swap(&a[j], &a[k]);   // pivot and j-th element are swapped
        qsort(a, l, j-1);
  • 42. 42 G BHARATHKUMAR, AssistantProfessor,CSEDepartment qsort(a,j+1,u); } } How QuickSort Works? 1. A pivot element is chosen from the array. You can choose any element from the array as the pivot element. Here, we have taken the rightmost (ie. the last element) of the array as the pivot element. Select a pivot element 2. The elements smaller than the pivot element are put on the left and the elements greater than the pivot element are put on the right. Put all the smaller elements on the left and greater on the right of pivot element The above arrangement is achieved by the following steps. a. A pointer is fixed at the pivot element. The pivot element is compared with the elements beginning from the first index. If the element greater than the pivot element is reached, a second pointer is set for that element. b. Now, the pivot element is compared with the other elements (a third pointer). If an element smaller than the pivot element is reached, the smaller element is swapped with the greater element found earlier.
  • 43. 43 G BHARATHKUMAR, AssistantProfessor,CSEDepartment Comparison of pivot element with other elements c. The process goes on until the second last element is reached. Finally, the pivot element is swapped with the second pointer.
  • 44. 44 G BHARATHKUMAR, AssistantProfessor,CSEDepartment Swap pivot element with the second pointer d. Now the left and right subparts of this pivot element are taken for further processing in the steps below. 3. Pivot elements are again chosen for the left and the right sub-parts separately. Within these sub-parts, the pivot elements are placed at their right position. Then, step 2 is repeated.
Select the pivot element in each half and put it at its correct place using recursion

4. The sub-parts are again divided into smaller sub-parts until each sub-part consists of a single element.
5. At this point, the array is already sorted.

Quicksort uses recursion for sorting the sub-parts. On the basis of the divide and conquer approach, the quicksort algorithm can be explained as:
 Divide: The array is divided into subparts taking the pivot as the partitioning point. The elements smaller than the pivot are placed to the left of the pivot and the elements greater than the pivot are placed to the right.
 Conquer: The left and the right subparts are again partitioned by selecting pivot elements for them. This can be achieved by recursively passing the subparts into the algorithm.
 Combine: This step does not play a significant role in quicksort. The array is already sorted at the end of the conquer step.
You can understand the working of quicksort with the help of the illustrations below.

Sorting the elements on the left of pivot using recursion

Recurrence Relation

Worst case: assume that partitioning around the pivot takes linear time. The worst case occurs when the subarrays are completely unbalanced: one subarray has 1 element and the other has n-1 elements. The recurrence is then
T(n) = T(n-1) + T(1) + Θ(n)
where the Θ(n) term is the linear-time partitioning step.

Best case: the pivot splits the array into two halves of (nearly) equal size, and the recurrence relation for quicksort is
T(n) = 2T(n/2) + O(n)
Merge Two Arrays in C

#include<stdio.h>
#include<conio.h>
int main()
{
    int arr1[50], arr2[50], size1, size2, i, k, merge[100];
    printf("Enter Array 1 Size: ");
    scanf("%d", &size1);
    printf("Enter Array 1 Elements: ");
    for(i = 0; i < size1; i++)
    {
        scanf("%d", &arr1[i]);
        merge[i] = arr1[i];
    }
    k = i;
    printf("\nEnter Array 2 Size: ");
    scanf("%d", &size2);
    printf("Enter Array 2 Elements: ");
    for(i = 0; i < size2; i++)
    {
        scanf("%d", &arr2[i]);
        merge[k] = arr2[i];
        k++;
    }
    printf("\nThe new array after merging is:\n");
    for(i = 0; i < k; i++)
        printf("%d ", merge[i]);
    getch();
    return 0;
}
  • 48. 48 G BHARATHKUMAR, AssistantProfessor,CSEDepartment Radix Sort: #include<stdio.h> int get_max (int a[], int n){ int i, max = a[0]; for (i = 1; i < n; i++) if (a[i] > max) max = a[i]; return max; } void radix_sort (int a[], int n){ int bucket[10][10], bucket_cnt[10]; int i, j, k, r, NOP = 0, divisor = 1, lar, pass; lar = get_max (a, n); while (lar > 0){ NOP++; lar /= 10; } for (pass = 0; pass < NOP; pass++){ for (i = 0; i < 10; i++){ bucket_cnt[i] = 0; } for (i = 0; i < n; i++){ r = (a[i] / divisor) % 10; bucket[r][bucket_cnt[r]] = a[i]; bucket_cnt[r] += 1; } i = 0; for (k = 0; k < 10; k++){ for (j = 0; j < bucket_cnt[k]; j++){
                a[i] = bucket[k][j];
                i++;
            }
        }
        divisor *= 10;
        printf("After pass %d : ", pass + 1);
        for (i = 0; i < n; i++)
            printf("%d ", a[i]);
        printf("\n");
    }
}
int main()
{
    int i, n, a[10];
    printf("Enter the number of items to be sorted: ");
    scanf("%d", &n);
    printf("Enter items: ");
    for (i = 0; i < n; i++)
    {
        scanf("%d", &a[i]);
    }
    radix_sort(a, n);
    printf("Sorted items : ");
    for (i = 0; i < n; i++)
        printf("%d ", a[i]);
    printf("\n");
    return 0;
}