Understanding Sorting Algorithms
Definition
A sorting algorithm is an algorithm that puts elements of a list into an order. The most frequently used orders are numerical order and lexicographical order.
~ Wikipedia
The concept
Sorting algorithms are the bread and butter of computer science. They are fundamental tools that allow us to organize data, each with specific trade-offs between space and time complexity.
We usually look at:
- Time Complexity: How fast it runs (Big O).
- Space Complexity: How much memory it needs.
- Stability: Does it keep the original order of equal items?
Quick Reference
| Algorithm | Best | Average | Worst | Space | Stable |
|---|---|---|---|---|---|
| Bubble | O(n) | O(n²) | O(n²) | O(1) | ✓ |
| Selection | O(n²) | O(n²) | O(n²) | O(1) | ✗ |
| Insertion | O(n) | O(n²) | O(n²) | O(1) | ✓ |
| Merge | O(n log n) | O(n log n) | O(n log n) | O(n) | ✓ |
| Quick | O(n log n) | O(n log n) | O(n²) | O(log n) | ✗ |
| Heap | O(n log n) | O(n log n) | O(n log n) | O(1) | ✗ |
Key Concepts
Definitions
- Time Complexity: How the runtime of an algorithm grows as the input size increases. We usually care about the worst-case scenario (Big O notation).
- Space Complexity: The amount of extra memory required by the algorithm to execute.
- Stability: A sorting algorithm is stable if it preserves the relative order of items with equal sort keys.
- In-Place Sorting: An algorithm that transforms the input using no auxiliary data structure.
Time Complexity Classes
- O(1): Constant - same time regardless of input
- O(n): Linear - time grows proportionally to input size
- O(n log n): Log-linear - efficient for large datasets
- O(n²): Quadratic - becomes slow with large datasets
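To get a feel for these classes, it helps to plug in concrete sizes. The sketch below is illustrative only: it counts abstract "operations" per class and ignores the constants and lower-order terms that real algorithms have.

```typescript
// Rough growth of each complexity class as input size n increases.
// Assumption: one abstract "operation" per unit of work; real
// constants are ignored, so these are not actual runtimes.
function growth(n: number) {
  return {
    linear: n,                              // O(n)
    logLinear: Math.round(n * Math.log2(n)), // O(n log n)
    quadratic: n * n,                        // O(n²)
  };
}

console.log(growth(10));   // { linear: 10, logLinear: 33, quadratic: 100 }
console.log(growth(1000)); // { linear: 1000, logLinear: 9966, quadratic: 1000000 }
```

Note how at n = 1000 the quadratic class already does about 100× more work than the log-linear one — the gap only widens as n grows.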
Space Complexity
- O(1): In-place algorithms (Bubble, Selection, Insertion, Heap)
- O(log n): Quick Sort’s recursion stack
- O(n): Merge Sort requires extra array
Stability
A stable sort preserves the relative order of equal elements. Important for multi-level sorting.
- Stable: Bubble, Insertion, Merge
- Unstable: Selection, Quick, Heap
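The hypothetical records below illustrate why stability matters for multi-level sorting: if the data is already ordered by name, a stable sort by grade keeps students with equal grades in name order. (JavaScript's built-in Array.prototype.sort is required to be stable since ES2019, so it works here.)

```typescript
// Hypothetical records, already sorted by name
const students = [
  { name: "Alice", grade: "B" },
  { name: "Bob", grade: "A" },
  { name: "Carol", grade: "B" },
  { name: "Dave", grade: "A" },
];

// A stable sort by grade preserves the existing name order
// among students with the same grade
const byGrade = [...students].sort((a, b) => a.grade.localeCompare(b.grade));

console.log(byGrade.map((s) => s.name));
// ["Bob", "Dave", "Alice", "Carol"] - Bob still before Dave,
// Alice still before Carol
```

An unstable sort (Selection, Quick, Heap) gives no such guarantee: Dave could end up before Bob even though both have grade A.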
The Algorithms
Bubble Sort
Repeatedly steps through the list, comparing adjacent elements and swapping them if they are in the wrong order. Simple but inefficient.
```typescript
function bubbleSort(arr: number[]): number[] {
  const result = [...arr];
  for (let i = 0; i < result.length; i++) {
    // After each pass, the largest unsorted element has bubbled
    // to the end, so the inner loop can stop i elements earlier
    for (let j = 0; j < result.length - i - 1; j++) {
      if (result[j] > result[j + 1]) {
        [result[j], result[j + 1]] = [result[j + 1], result[j]];
      }
    }
  }
  return result;
}
```

Selection Sort
Finds the minimum element and places it at the beginning. Minimizes swaps.
```typescript
function selectionSort(arr: number[]): number[] {
  const result = [...arr];
  for (let i = 0; i < result.length; i++) {
    // Find the minimum of the unsorted portion
    let minIndex = i;
    for (let j = i + 1; j < result.length; j++) {
      if (result[j] < result[minIndex]) minIndex = j;
    }
    // Swap it into position i (at most one swap per pass)
    if (minIndex !== i) {
      [result[i], result[minIndex]] = [result[minIndex], result[i]];
    }
  }
  return result;
}
```

Insertion Sort
Builds the sorted array one item at a time. Efficient for small or nearly sorted data.
```typescript
function insertionSort(arr: number[]): number[] {
  const result = [...arr];
  for (let i = 1; i < result.length; i++) {
    const key = result[i];
    // Shift larger elements of the sorted prefix one slot right
    let j = i - 1;
    while (j >= 0 && result[j] > key) {
      result[j + 1] = result[j];
      j--;
    }
    // Insert the key into its correct position
    result[j + 1] = key;
  }
  return result;
}
```

Merge Sort
Divides the array into halves, recursively sorts them, then merges. Guaranteed O(n log n).
```typescript
function mergeSort(arr: number[]): number[] {
  if (arr.length <= 1) return arr;

  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));

  return merge(left, right);
}

// Merge two already-sorted arrays into one sorted array
function merge(left: number[], right: number[]): number[] {
  const result: number[] = [];
  let i = 0;
  let j = 0;

  while (i < left.length && j < right.length) {
    // <= (not <) keeps equal elements in their original order,
    // which is what makes Merge Sort stable
    if (left[i] <= right[j]) {
      result.push(left[i]);
      i++;
    } else {
      result.push(right[j]);
      j++;
    }
  }

  // Append whatever remains of either half
  return result.concat(left.slice(i)).concat(right.slice(j));
}
```

Quick Sort
Picks a pivot, partitions the array around it, then recursively sorts the partitions. Fast in practice.
```typescript
function quickSort(arr: number[]): number[] {
  const result = [...arr];

  // Partition places the pivot in its correct position:
  // all smaller elements to its left, larger to its right
  function partition(low: number, high: number): number {
    const pivot = result[high]; // Choosing the last element as pivot
    let i = low - 1; // Index of the last element smaller than the pivot

    for (let j = low; j < high; j++) {
      // If the current element is smaller than the pivot
      if (result[j] < pivot) {
        i++;
        [result[i], result[j]] = [result[j], result[i]];
      }
    }
    // Place the pivot just after the last smaller element
    [result[i + 1], result[high]] = [result[high], result[i + 1]];
    return i + 1;
  }

  function sort(low: number, high: number) {
    if (low < high) {
      const pi = partition(low, high);
      // Recursively sort the elements before and after the partition
      sort(low, pi - 1);
      sort(pi + 1, high);
    }
  }

  sort(0, result.length - 1);
  return result;
}
```

Heap Sort
Uses the binary heap data structure. Guaranteed O(n log n) with O(1) extra space.
```typescript
function heapSort(arr: number[]): number[] {
  const result = [...arr];
  const n = result.length;

  // Build a max heap: rearrange the array so every parent >= its children
  for (let i = Math.floor(n / 2) - 1; i >= 0; i--) {
    heapify(result, n, i);
  }

  // Extract elements from the heap one by one
  for (let i = n - 1; i > 0; i--) {
    // Move the current root (largest) to the end
    [result[0], result[i]] = [result[i], result[0]];
    // Restore the heap property on the reduced heap
    heapify(result, i, 0);
  }

  return result;
}

// Maintains the heap property for the subtree rooted at index i
function heapify(arr: number[], n: number, i: number): void {
  let largest = i;
  const left = 2 * i + 1;
  const right = 2 * i + 2;

  // If the left child is larger than the root
  if (left < n && arr[left] > arr[largest]) largest = left;
  // If the right child is larger than the largest so far
  if (right < n && arr[right] > arr[largest]) largest = right;

  // If the largest is not the root, swap and continue down
  if (largest !== i) {
    [arr[i], arr[largest]] = [arr[largest], arr[i]];
    // Recursively heapify the affected subtree
    heapify(arr, n, largest);
  }
}
```

Deep Dive
Why Quick Sort isn’t always O(n log n)
Quick Sort’s performance depends heavily on the pivot selection.
- Best Case: Pivot divides array into two equal halves.
- Worst Case: Pivot is the smallest or largest element (e.g., sorting an already sorted array with last element as pivot). This creates unbalanced partitions, degrading to O(n²).
NB: To mitigate this, robust implementations often use a randomized pivot or median-of-three (picking the median of the first, middle, and last elements) to keep partitions balanced.
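As a sketch, a median-of-three pivot choice could look like this. The `medianOfThree` helper is hypothetical (not part of the Quick Sort above); it returns the index of the median of the first, middle, and last elements, which a quick sort would swap into the pivot position before partitioning.

```typescript
// Hypothetical helper: index of the median of arr[low], arr[mid], arr[high].
// A quick sort implementation would swap this element into the pivot
// position before calling partition.
function medianOfThree(arr: number[], low: number, high: number): number {
  const mid = Math.floor((low + high) / 2);
  const a = arr[low], b = arr[mid], c = arr[high];
  if ((a <= b && b <= c) || (c <= b && b <= a)) return mid;
  if ((b <= a && a <= c) || (c <= a && a <= b)) return low;
  return high;
}

// On an already sorted range the median is the middle element,
// which splits the range evenly instead of degrading to O(n²)
medianOfThree([1, 2, 3, 4, 5], 0, 4); // → 2
```

On sorted or reverse-sorted input — exactly the cases where a last-element pivot degrades — this always picks the middle element, restoring balanced partitions.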
Space Complexity: Merge vs. Quick
- Merge Sort is “out-of-place”. It requires O(n) auxiliary space to store the sub-arrays during the merge phase. This can be costly for very large datasets in memory-constrained environments.
- Quick Sort is “in-place”. It doesn’t create new arrays but uses the call stack for recursion. The space complexity is O(log n) for the stack frames when partitions stay balanced (a naive implementation can hit O(n) in the worst case), which is generally much lighter than Merge Sort’s requirement.
JavaScript’s Array.sort()
JavaScript engines use sophisticated algorithms (usually Timsort):
```typescript
// Without a comparator - elements are converted to strings
// and sorted lexicographically
const numbers = [10, 5, 40, 25, 1000, 1];
numbers.sort(); // [1, 10, 1000, 25, 40, 5] ⚠️

// With a comparator - numeric sorting
numbers.sort((a, b) => a - b); // [1, 5, 10, 25, 40, 1000] ✓
```
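Comparators generalize to other orderings as well. A small sketch (the values here are arbitrary examples):

```typescript
// Descending numeric order: flip the subtraction
const desc = [40, 10, 25].sort((a, b) => b - a); // [40, 25, 10]

// Case-insensitive string order via localeCompare;
// sensitivity: "base" treats "A" and "a" as equal
const words = ["banana", "Apple", "cherry"];
words.sort((a, b) => a.localeCompare(b, "en", { sensitivity: "base" }));
// ["Apple", "banana", "cherry"]
```

The comparator contract is simple: return a negative number if `a` should come first, a positive number if `b` should, and 0 if they are equivalent.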