In the previous articles we gave a short overview of the most popular sorting algorithms and their implementations.
But there are many more things we should have a basic understanding of when dealing with this topic.
Let's take a look and learn them all today.
An algorithm is called in-place when it transforms its input using almost no additional memory; a small constant amount of extra space for variables is allowed.
The strict definition of in-place algorithms includes all algorithms with O(1) space complexity.
The input is usually overwritten by the output, because the input sequence is only updated by replacing or swapping elements.
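To make the idea concrete, here is a small sketch (an illustrative addition, not from the original article) of an in-place transformation: reversing an array purely by swapping its elements.

// In-place reversal: the only extra memory is the two index
// variables and the temp value, so the space complexity is O(1).
const reverseInPlace = arr => {
  let left = 0;
  let right = arr.length - 1;
  while (left < right) {
    const temp = arr[left];
    arr[left] = arr[right];
    arr[right] = temp;
    left++;
    right--;
  }
  return arr;
};

console.log(reverseInPlace([1, 2, 3, 4])); // Prints "[4, 3, 2, 1]"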
Examples of in-place sorting algorithms:
Bubble Sort - repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order. The space complexity is O(1), because only a single additional memory space is required (to store the temp variable).
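Here is what that could look like in JavaScript (a minimal sketch added for illustration):

// Bubble Sort: only the temp value used for swapping is extra memory, hence O(1) space.
const bubbleSort = arr => {
  for (let i = 0; i < arr.length - 1; i++) {
    // After each pass, the largest remaining element has "bubbled up" to the end.
    for (let j = 0; j < arr.length - 1 - i; j++) {
      if (arr[j] > arr[j + 1]) {
        const temp = arr[j]; // the single additional memory space mentioned above
        arr[j] = arr[j + 1];
        arr[j + 1] = temp;
      }
    }
  }
  return arr;
};

console.log(bubbleSort([5, 1, 4, 2])); // Prints "[1, 2, 4, 5]"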
Insertion Sort - iterates, consuming one input element on each repetition and growing a sorted output list. At each iteration, it removes one element from the input data, finds the location it belongs to within the sorted list, and inserts it there. The space complexity is O(1).
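A minimal Insertion Sort sketch (again an illustrative addition, not from the original article):

// Insertion Sort: elements are shifted within the input array itself, so space is O(1).
const insertionSort = arr => {
  for (let i = 1; i < arr.length; i++) {
    const current = arr[i];
    let j = i - 1;
    // Shift larger elements of the sorted part one position to the right.
    while (j >= 0 && arr[j] > current) {
      arr[j + 1] = arr[j];
      j--;
    }
    arr[j + 1] = current; // insert the element into the location it belongs to
  }
  return arr;
};

console.log(insertionSort([5, 1, 4, 2])); // Prints "[1, 2, 4, 5]"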
Selection Sort - divides the input array into two sublists: a sorted one, which is built up from left to right, and a sublist of the remaining unsorted items, which takes up the rest of the list.
Initially, the sorted sublist is empty and the unsorted sublist contains the entire input array.
The algorithm works by finding the smallest (or the largest, depending on the order) element in the unsorted sublist and swapping it with the leftmost unsorted element, thereby extending the sorted sublist by one.
This process continues until the entire input array is sorted. The space complexity is O(1).
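And a minimal Selection Sort sketch for illustration:

// Selection Sort: the index i marks the boundary between the sorted
// and unsorted sublists; only a few variables are used, so space is O(1).
const selectionSort = arr => {
  for (let i = 0; i < arr.length - 1; i++) {
    let minIndex = i;
    // Find the smallest element in the unsorted sublist.
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[j] < arr[minIndex]) {
        minIndex = j;
      }
    }
    // Swap it with the leftmost unsorted element, extending the sorted sublist.
    const temp = arr[i];
    arr[i] = arr[minIndex];
    arr[minIndex] = temp;
  }
  return arr;
};

console.log(selectionSort([5, 1, 4, 2])); // Prints "[1, 2, 4, 5]"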
Heap Sort - divides its input into a sorted and an unsorted region and iteratively shrinks the unsorted region by extracting the largest element from it and inserting it into the sorted region.
It can be considered an improved Selection Sort: Heap Sort does not waste time on a linear-time scan of the unsorted region; instead, it maintains the unsorted region in a heap data structure to find the largest element in each step more quickly. The space complexity is O(1).
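A minimal Heap Sort sketch (an illustrative addition; note that this version uses a recursive heapify, whose call stack technically takes O(logN) space, and an iterative heapify would make the extra space truly constant):

// heapify sifts the value at index root down until the subtree is a valid max-heap.
const heapify = (arr, size, root) => {
  let largest = root;
  const left = 2 * root + 1;
  const right = 2 * root + 2;
  if (left < size && arr[left] > arr[largest]) largest = left;
  if (right < size && arr[right] > arr[largest]) largest = right;
  if (largest !== root) {
    [arr[root], arr[largest]] = [arr[largest], arr[root]];
    heapify(arr, size, largest);
  }
};

const heapSort = arr => {
  // Build a max-heap inside the array itself.
  for (let i = Math.floor(arr.length / 2) - 1; i >= 0; i--) {
    heapify(arr, arr.length, i);
  }
  // Repeatedly move the largest element to the end of the unsorted region.
  for (let end = arr.length - 1; end > 0; end--) {
    [arr[0], arr[end]] = [arr[end], arr[0]];
    heapify(arr, end, 0);
  }
  return arr;
};

console.log(heapSort([5, 1, 4, 2])); // Prints "[1, 2, 4, 5]"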
Quick Sort(?) - works by selecting a pivot element from the array and partitioning the other elements into two sub-arrays, according to whether they are less than or greater than the pivot.
The sub-arrays are then sorted recursively.
On the one hand, the space complexity of this algorithm is O(logN) because of the recursion stack, which disqualifies it from being an in-place algorithm, since it is greater than O(1).
On the other hand, it can be qualified as an in-place algorithm, since it only exchanges elements within the input data structure.
In summary, whether Quick Sort counts as in-place or not-in-place depends on the definition of in-place algorithms, which differs between sources.
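Here is a sketch of Quick Sort using a Lomuto-style partition (an illustrative addition, not from the original article); notice that all element moves are swaps inside the input array, while the recursion stack accounts for the O(logN) space:

// Quick Sort: partition around the pivot, then recurse into both halves.
const quickSort = (arr, low = 0, high = arr.length - 1) => {
  if (low >= high) return arr;
  const pivot = arr[high]; // here the pivot is simply the last element
  let i = low;
  for (let j = low; j < high; j++) {
    if (arr[j] < pivot) {
      [arr[i], arr[j]] = [arr[j], arr[i]];
      i++;
    }
  }
  [arr[i], arr[high]] = [arr[high], arr[i]]; // put the pivot in its final position
  quickSort(arr, low, i - 1);
  quickSort(arr, i + 1, high);
  return arr;
};

console.log(quickSort([5, 1, 4, 2])); // Prints "[1, 2, 4, 5]"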
An algorithm is called not-in-place if it transforms the input using additional memory that depends on the input size.
Examples of not-in-place sorting algorithms:
Merge Sort - works by dividing the unsorted list into n sublists, each containing one element, and repeatedly merging these sublists to produce new sorted sublists until only one remains. The space complexity is O(N).
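A runnable mergeSort example appears later in this article, in the Big O notation section; the new arrays it allocates with slice and during merging are exactly what push its space complexity to O(N).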
Divide and conquer is an algorithm design paradigm based on multi-branched recursion.
Algorithms using this paradigm work by recursively decomposing a problem into two or more sub-problems of the same or related type until they are simple enough to be solved directly.
The solutions for the sub-problems are then combined to find a solution for the original problem.
This divide-and-conquer technique is the basis of efficient algorithms for all kinds of problems, not only for sorting.
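Before looking at concrete sorting examples, here is a tiny sketch of the paradigm outside of sorting (the maxOf helper is an illustrative name, not from the original article): finding the maximum of an array by splitting it in half recursively.

// Divide and conquer: split the problem in half, solve each half
// recursively, then combine the two partial answers.
// Assumes a non-empty array.
const maxOf = arr => {
  if (arr.length === 1) return arr[0]; // simple enough to solve directly
  const mid = Math.floor(arr.length / 2);
  const leftMax = maxOf(arr.slice(0, mid)); // divide
  const rightMax = maxOf(arr.slice(mid));
  return Math.max(leftMax, rightMax); // combine
};

console.log(maxOf([3, 7, 2, 9, 4])); // Prints "9"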
Examples of divide-and-conquer sorting algorithms: Merge Sort and Quick Sort.
A comparison-based algorithm is a sorting algorithm that works only on the input array, comparing pairs of elements and moving elements around based on the results of these comparisons.
Examples of comparison-based sorting algorithms: Bubble Sort, Insertion Sort, Selection Sort, Merge Sort, Heap Sort, and Quick Sort.
A stable sorting algorithm preserves the order of records with equal keys.
A sorting algorithm is considered stable if two objects with equal keys appear in the sorted output in the same order as in the unsorted input.
It is important to use a stable sorting algorithm if the problem to be solved requires this relative order to be retained.
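To see stability in action, here is a small illustrative example (since ES2019 the built-in Array.prototype.sort is required to be stable):

// Alice and Carol have equal "age" keys; a stable sort keeps
// their original relative order after sorting by age.
const people = [
  { name: 'Alice', age: 30 },
  { name: 'Bob', age: 25 },
  { name: 'Carol', age: 30 },
];

people.sort((a, b) => a.age - b.age);

console.log(people.map(p => p.name)); // Prints "['Bob', 'Alice', 'Carol']"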
Examples of stable sorting algorithms: Bubble Sort, Insertion Sort, and Merge Sort.
An unstable sorting algorithm does not preserve the order of records with equal keys.
Examples of unstable sorting algorithms: Selection Sort, Heap Sort, and Quick Sort.
In the previous sections we talked a little about space complexity, but do we really understand what O(...) means?
It is called Big O notation.
In computer science, it is used to describe how the execution time or the space required by an algorithm grows as the input size grows.
The best way to understand this concept is to look at some examples.
// O(1) - constant time: accessing the last element takes the same
// amount of time regardless of the array's size.
const getLastElement = arr => arr[arr.length - 1];

console.log(getLastElement([1, 2, 3])); // Prints "3"
// O(N) - linear time: in the worst case the loop visits every element once.
const findValue = (arr, value) => {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === value) {
      return 1; // the value was found
    }
  }
  return -1; // the value is not in the array
};

console.log(findValue([1, 6, 75], 75)); // Prints "1"
console.log(findValue([1, 6, 75], 2)); // Prints "-1"
// O(N^2) - quadratic time: the nested loops check every element of
// every row, so for a square N x N input the work grows as N^2.
const findValue = (arr, value) => {
  for (let i = 0; i < arr.length; i++) {
    for (let j = 0; j < arr[i].length; j++) {
      if (arr[i][j] === value) {
        return 1;
      }
    }
  }
  return -1;
};

const arr = [
  [1, 2, 3],
  [4, 5, 6],
];

console.log(findValue(arr, 6)); // Prints "1"
console.log(findValue(arr, 7)); // Prints "-1"
// O(logN) - logarithmic time: each comparison halves the remaining
// search range, which is why the input array must be sorted.
const binarySearch = (array, target) => {
  let min = 0;
  let max = array.length - 1;
  while (min <= max) {
    const guess = Math.floor((min + max) / 2);
    if (array[guess] === target) {
      return guess; // found: return the index of the target
    } else if (array[guess] < target) {
      min = guess + 1;
    } else {
      max = guess - 1;
    }
  }
  return -1; // the target is not in the array
};

console.log(binarySearch([1, 6, 75], 75)); // Prints "2"
// O(NlogN) - the array is halved logN times, and each level of
// recursion merges a total of N elements.
const merge = (left, right) => {
  const result = [];
  // Take the smaller head element until one of the lists is exhausted.
  while (left.length && right.length) {
    if (left[0] < right[0]) {
      result.push(left.shift());
    } else {
      result.push(right.shift());
    }
  }
  // Append whatever remains in the non-empty list.
  return result.concat(left, right);
};

const mergeSort = arr => {
  if (arr.length <= 1) {
    return arr;
  }
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  return merge(left, right);
};

console.log(mergeSort([5, 1, 4, 2])); // Prints "[1, 2, 4, 5]"
// O(2^N) - exponential time: each call spawns two more recursive
// calls, so the number of operations roughly doubles as num grows.
const fibonacci = num => {
  if (num <= 1) {
    return num;
  }
  return fibonacci(num - 1) + fibonacci(num - 2);
};

console.log(fibonacci(10)); // Prints "55"
The following image shows how the number of operations grows with the input size n for each complexity class:
The complexity of an algorithm is a measure of the time, space, or other resources required for its execution.
Usually, this involves determining a function that relates the length of an algorithm's input to the number of steps it takes (its time complexity) or the number of storage locations it uses (its space complexity).
An algorithm is considered efficient if the values of this function are small, or grow slowly compared to an increase in the size of the input.
Different inputs of equal length may cause the algorithm to behave differently, so best-, worst-, and average-case descriptions may all be of practical interest.
Unless otherwise specified, the function that describes the performance of an algorithm is usually an upper bound, determined from the worst-case inputs to the algorithm.
Let's take a look at the Quick Sort algorithm:
Best-case time complexity is O(NlogN), if each time we divide the array into two nearly equal or equal pieces.
Worst-case time complexity is O(N^2), if one of the pieces has a length of N - 1 and this happens repeatedly in every partition. This can occur if the selected pivot element is the smallest or the largest one in the list, or in some implementations when all elements are equal.
Average-case time complexity is O(NlogN).
In this article we filled in the gaps that may have arisen while learning different types of search and sorting algorithms.
Once you have learned these simple concepts, try to keep your code as performant as possible.
Although this seems like an obvious point, many developers still neglect it, which leads to code that just works but is slow.