This document summarizes key aspects of sorting algorithms discussed in a lecture on sorting, including:
- Common sorting algorithms like bogo sort, selection sort, insertion sort, and merge sort.
- The process and pseudocode for selection sort and merge sort.
- Time complexity: selection sort runs in O(n^2), while merge sort runs in O(n log n).
Merge sort has a runtime of O(n log n). It works by recursively dividing an array into halves and then merging the sorted halves. First it divides the array into single elements, sorts those, then merges them back together piece by piece into a fully sorted array.
2. 2
Sorting
• sorting: Rearranging the values in an array or collection into a
specific order (usually into their "natural ordering").
– one of the fundamental problems in computer science
– can be solved in many ways:
• there are many sorting algorithms
• some are faster/slower than others
• some use more/less memory than others
• some work better with specific kinds of data
• some can utilize multiple computers / processors, ...
– comparison-based sorting : determining order by comparing pairs
of elements:
• <, >, compareTo, …
3. 3
Comparable and sorting
• The Arrays and Collections classes in java.util have a
static method sort that sorts the elements of an array/list
Point[] points = new Point[4];
points[0] = new Point(7, 6);
points[1] = new Point(10, 2);
points[2] = new Point(7, -1);
points[3] = new Point(3, 11);
Arrays.sort(points);
System.out.println(Arrays.toString(points));
// [(3, 11), (7, -1), (7, 6), (10, 2)]
List<Point> points = new ArrayList<Point>();
points.add(...);
Collections.sort(points);
System.out.println(points);
// [(3, 11), (7, -1), (7, 6), (10, 2)]
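Arrays.sort and Collections.sort rely on the elements' compareTo method, so the Point class used above must implement Comparable. A minimal sketch consistent with the output shown (ascending x, with ties broken by ascending y); the lecture's actual Point class is not shown on this slide:

```java
// Hypothetical Point class with the natural ordering the output
// above implies: by x first, then by y.
class Point implements Comparable<Point> {
    private final int x;
    private final int y;

    public Point(int x, int y) {
        this.x = x;
        this.y = y;
    }

    // Negative if this point comes before other, positive if after.
    @Override
    public int compareTo(Point other) {
        if (x != other.x) {
            return Integer.compare(x, other.x);
        }
        return Integer.compare(y, other.y);
    }

    @Override
    public String toString() {
        return "(" + x + ", " + y + ")";
    }
}
```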
4. 4
Sorting algorithms
• bogo sort: shuffle and pray
• bubble sort: swap adjacent pairs that are out of order
• selection sort: look for the smallest element, move to front
• insertion sort: build an increasingly large sorted front portion
• merge sort: recursively divide the array in half and sort it
• heap sort: place the values into a sorted tree structure
• quick sort: recursively partition array based on a middle value
other specialized sorting algorithms:
• bucket sort: cluster elements into smaller groups, sort them
• radix sort: sort integers by last digit, then 2nd to last, then ...
• ...
5. 5
Bogo sort
• bogo sort: Orders a list of values by repetitively shuffling
them and checking if they are sorted.
– name comes from the word "bogus"
The algorithm:
– Scan the list, seeing if it is sorted. If so, stop.
– Else, shuffle the values in the list and repeat.
• This sorting algorithm (obviously) has terrible performance!
– What is its runtime?
6. 6
Bogo sort code
// Places the elements of a into sorted order.
public static void bogoSort(int[] a) {
while (!isSorted(a)) {
shuffle(a);
}
}
// Returns true if a's elements are in sorted order.
public static boolean isSorted(int[] a) {
for (int i = 0; i < a.length - 1; i++) {
if (a[i] > a[i + 1]) {
return false;
}
}
return true;
}
7. 7
Bogo sort code, cont'd.
// Shuffles an array of ints by randomly swapping each
// element with an element ahead of it in the array.
public static void shuffle(int[] a) {
for (int i = 0; i < a.length - 1; i++) {
// pick a random index in [i+1, a.length-1]
int range = a.length - (i + 1);
int j = (int) (Math.random() * range + (i + 1));
swap(a, i, j);
}
}
// Swaps a[i] with a[j].
public static void swap(int[] a, int i, int j) {
if (i != j) {
int temp = a[i];
a[i] = a[j];
a[j] = temp;
}
}
8. 8
Selection sort
• selection sort: Orders a list of values by repeatedly putting
the smallest or largest unplaced value into its final position.
The algorithm:
– Look through the list to find the smallest value.
– Swap it so that it is at index 0.
– Look through the list to find the second-smallest value.
– Swap it so that it is at index 1.
...
– Repeat until all values are in their proper places.
10. 10
Selection sort code
// Rearranges the elements of a into sorted order using
// the selection sort algorithm.
public static void selectionSort(int[] a) {
for (int i = 0; i < a.length - 1; i++) {
// find index of smallest remaining value
int min = i;
for (int j = i + 1; j < a.length; j++) {
if (a[j] < a[min]) {
min = j;
}
}
// swap smallest value into its proper place, a[i]
swap(a, i, min);
}
}
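A small driver showing the method in action (the swap helper from the earlier slide is repeated here so the sketch compiles on its own):

```java
// Self-contained demo of the selection sort shown above.
public class SelectionSortDemo {
    public static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;                   // index of smallest remaining value
            for (int j = i + 1; j < a.length; j++) {
                if (a[j] < a[min]) {
                    min = j;
                }
            }
            swap(a, i, min);               // move it into its proper place
        }
    }

    // Swaps a[i] with a[j].
    public static void swap(int[] a, int i, int j) {
        if (i != j) {
            int temp = a[i];
            a[i] = a[j];
            a[j] = temp;
        }
    }

    public static void main(String[] args) {
        int[] a = {22, 18, 12, -4, 27, 30, 36, 50};
        selectionSort(a);
        System.out.println(java.util.Arrays.toString(a));
        // [-4, 12, 18, 22, 27, 30, 36, 50]
    }
}
```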
12. 12
Similar algorithms
• bubble sort: Make repeated passes, swapping adjacent values
– slower than selection sort (has to do more swaps)
• insertion sort: Shift each element into a sorted sub-array
– faster than selection sort (examines fewer values)
index 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16
value 22 18 12 -4 27 30 36 50 7 68 91 56 2 85 42 98 25
after one bubble sort pass (values 22, 50, 91, 98 bubble right):
value 18 12 -4 22 27 30 36 7 50 68 56 2 85 42 91 25 98
insertion sort with sorted sub-array at indexes 0-7; next value to insert is 7:
value -4 12 18 22 27 30 36 50 7 68 91 56 2 85 42 98 25
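The slide describes insertion sort without code; one possible implementation consistent with the trace above (a sketch, not taken from the lecture slides):

```java
// Insertion sort: grow a sorted front portion by shifting each
// element left into its sorted position.
public static void insertionSort(int[] a) {
    for (int i = 1; i < a.length; i++) {
        int value = a[i];          // next element to insert
        int j = i - 1;
        // shift larger sorted elements one slot to the right
        while (j >= 0 && a[j] > value) {
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = value;          // drop value into the gap
    }
}
```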
13. 13
Merge sort
• merge sort: Repeatedly divides the data in half, sorts each
half, and combines the sorted halves into a sorted whole.
The algorithm:
– Divide the list into two roughly equal halves.
– Sort the left half.
– Sort the right half.
– Merge the two sorted halves into one sorted list.
– Often implemented recursively.
– An example of a "divide and conquer" algorithm.
• Invented by John von Neumann in 1945
15. 15
Splitting in half
// Returns the first half of the given array.
public static int[] leftHalf(int[] a) {
int size1 = a.length / 2;
int[] left = new int[size1];
for (int i = 0; i < size1; i++) {
left[i] = a[i];
}
return left;
}
// Returns the second half of the given array.
public static int[] rightHalf(int[] a) {
int size1 = a.length / 2;
int size2 = a.length - size1;
int[] right = new int[size2];
for (int i = 0; i < size2; i++) {
right[i] = a[i + size1];
}
return right;
}
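The same two helpers can be written more compactly with the standard library's Arrays.copyOfRange, which copies the half-open index range [from, to):

```java
import java.util.Arrays;

// Equivalent halving helpers using Arrays.copyOfRange.
public static int[] leftHalf(int[] a) {
    return Arrays.copyOfRange(a, 0, a.length / 2);
}

public static int[] rightHalf(int[] a) {
    return Arrays.copyOfRange(a, a.length / 2, a.length);
}
```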
17. 17
Merge halves code
// Merges the left/right elements into a sorted result.
// Precondition: left/right are sorted
public static void merge(int[] result, int[] left,
int[] right) {
int i1 = 0; // index into left array
int i2 = 0; // index into right array
for (int i = 0; i < result.length; i++) {
if (i2 >= right.length ||
(i1 < left.length && left[i1] <= right[i2])) {
result[i] = left[i1]; // take from left
i1++;
} else {
result[i] = right[i2]; // take from right
i2++;
}
}
}
18. 18
Merge sort code
// Rearranges the elements of a into sorted order using
// the merge sort algorithm.
public static void mergeSort(int[] a) {
// split array into two halves
int[] left = leftHalf(a);
int[] right = rightHalf(a);
// sort the two halves
...
// merge the sorted halves into a sorted whole
merge(a, left, right);
}
19. 19
Merge sort code 2
// Rearranges the elements of a into sorted order using
// the merge sort algorithm (recursive).
public static void mergeSort(int[] a) {
if (a.length >= 2) {
// split array into two halves
int[] left = leftHalf(a);
int[] right = rightHalf(a);
// sort the two halves
mergeSort(left);
mergeSort(right);
// merge the sorted halves into a sorted whole
merge(a, left, right);
}
}
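Putting the recursive method together with the merge helper from the earlier slide gives a self-contained, runnable version (the halving here uses Arrays.copyOfRange in place of the leftHalf/rightHalf helpers, with identical behavior):

```java
import java.util.Arrays;

// The recursive merge sort from the slides assembled into one class.
public class MergeSortDemo {
    public static void mergeSort(int[] a) {
        if (a.length >= 2) {
            // split array into two halves
            int[] left  = Arrays.copyOfRange(a, 0, a.length / 2);
            int[] right = Arrays.copyOfRange(a, a.length / 2, a.length);
            // sort the two halves
            mergeSort(left);
            mergeSort(right);
            // merge the sorted halves into a sorted whole
            merge(a, left, right);
        }
    }

    // Merges the sorted left/right arrays into result.
    public static void merge(int[] result, int[] left, int[] right) {
        int i1 = 0;    // index into left array
        int i2 = 0;    // index into right array
        for (int i = 0; i < result.length; i++) {
            if (i2 >= right.length ||
                    (i1 < left.length && left[i1] <= right[i2])) {
                result[i] = left[i1++];    // take from left
            } else {
                result[i] = right[i2++];   // take from right
            }
        }
    }

    public static void main(String[] args) {
        int[] a = {22, 18, 12, -4, 27, 30, 36, 50, 7, 68};
        mergeSort(a);
        System.out.println(Arrays.toString(a));
        // [-4, 7, 12, 18, 22, 27, 30, 36, 50, 68]
    }
}
```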