QuickSort¶
Quicksort is often the best practical choice for sorting because it is remarkably efficient on average: its expected running time is $\Theta(n \lg n)$, and the constant factors hidden in the $\Theta(n \lg n)$ notation are quite small. It also has the advantage of sorting in place, and it works well even in virtual-memory environments.
Stability
Quicksort is an unstable sorting algorithm.
A sorting algorithm is said to be stable if two objects with equal keys appear in the same order in the sorted output as they appear in the unsorted input.
Conversely, a sorting algorithm is unstable if two or more objects with equal keys do not appear in the same order before and after sorting.
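As a quick illustration (using Python's built-in `sorted`, which is guaranteed to be stable; the record values here are made up for the example):

```python
# Records share keys 1 and 2; a stable sort keeps equal-key records in
# their original relative order.
records = [("apple", 2), ("banana", 1), ("cherry", 2), ("date", 1)]

print(sorted(records, key=lambda r: r[1]))
# [('banana', 1), ('date', 1), ('apple', 2), ('cherry', 2)]
# 'apple' still precedes 'cherry' among the key-2 records.
# An in-place quicksort keyed on r[1] may swap such equal-key records past
# each other while partitioning, which is why quicksort is unstable.
```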
Time complexity¶
Average-Case | Worst-Case |
---|---|
$\Theta(n \lg n)$ | $O(n^2)$ |
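These bounds follow from the usual recurrences: a balanced partition splits the work roughly in half at every level, while a maximally unbalanced partition (for example, an already sorted input with a fixed last-element pivot) peels off one element at a time.

$$T(n) = 2\,T(n/2) + \Theta(n) = \Theta(n \lg n) \qquad \text{vs.} \qquad T(n) = T(n-1) + \Theta(n) = \Theta(n^2)$$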
Description¶
Quicksort, like merge sort, applies the divide-and-conquer paradigm.
Here is the three-step divide-and-conquer process for sorting a typical subarray $A[p..r]$.
1. Divide¶
Partition (rearrange) the array $A[p..r]$ into two (possibly empty) subarrays $A[p..q-1]$ and $A[q+1..r]$ such that each element of $A[p..q-1]$ is less than or equal to $A[q]$, which is, in turn, less than or equal to each element of $A[q+1..r]$. Compute the index $q$ as part of this partitioning procedure.
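As a concrete illustration of what the Divide step must produce (shown out of place with list comprehensions for readability; the in-place index bookkeeping appears in the implementations below, and the sample values are just an example):

```python
A = [2, 8, 7, 1, 3, 5, 6, 4]
pivot = A[-1]                                  # take A[r] = 4 as the pivot
left  = [x for x in A[:-1] if x <= pivot]      # elements <= pivot
right = [x for x in A[:-1] if x >  pivot]      # elements >  pivot
A, q = left + [pivot] + right, len(left)
print(A, q)  # [2, 1, 3, 4, 8, 7, 5, 6]  q == 3
```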
2. Conquer¶
Sort the two subarrays $A[p..q-1]$ and $A[q+1..r]$ by recursive calls to quicksort.
3. Combine¶
Because the subarrays are already sorted, no work is needed to combine them: the entire array $A[p..r]$ is now sorted.
Implementations¶
CLRS version¶
SLOW!
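A minimal Python sketch of the CLRS-style quicksort with the Lomuto partition described in the Divide step; the function names, zero-based indexing, and sample data are illustrative, not taken from the original listing.

```python
def partition(A, p, r):
    """Lomuto partition: place the pivot A[r] at its final index q."""
    x = A[r]                         # pivot
    i = p - 1
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]  # grow the "<= pivot" prefix
    A[i + 1], A[r] = A[r], A[i + 1]  # move pivot between the two parts
    return i + 1                     # index q of the pivot


def quicksort(A, p=0, r=None):
    """Sort A[p..r] in place, CLRS-style."""
    if r is None:
        r = len(A) - 1
    if p < r:
        q = partition(A, p, r)       # Divide
        quicksort(A, p, q - 1)       # Conquer left part
        quicksort(A, q + 1, r)       # Conquer right part
        # Combine: nothing to do, the array is already sorted in place


if __name__ == "__main__":
    data = [2, 8, 7, 1, 3, 5, 6, 4]
    quicksort(data)
    print(data)  # [1, 2, 3, 4, 5, 6, 7, 8]
```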
FAST version¶
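A sketch of one common way to speed quicksort up in practice: randomized pivot selection with Hoare-style partitioning, recursing only into the smaller side so the stack depth stays $O(\lg n)$. This is an illustrative variant and not necessarily how the FAST version here was written.

```python
import random


def hoare_partition(A, lo, hi):
    """Hoare partition around a randomly chosen pivot.

    Returns an index j with lo <= j < hi such that every element of
    A[lo..j] is <= every element of A[j+1..hi].
    """
    k = random.randint(lo, hi)       # randomization avoids bad-input worst cases
    A[lo], A[k] = A[k], A[lo]        # move the chosen pivot to the front
    pivot = A[lo]
    i, j = lo - 1, hi + 1
    while True:
        i += 1
        while A[i] < pivot:          # scan right for an element >= pivot
            i += 1
        j -= 1
        while A[j] > pivot:          # scan left for an element <= pivot
            j -= 1
        if i >= j:
            return j
        A[i], A[j] = A[j], A[i]      # put both elements on the correct side


def quicksort_fast(A, lo=0, hi=None):
    """In-place quicksort: recurse on the smaller half, loop on the larger."""
    if hi is None:
        hi = len(A) - 1
    while lo < hi:
        j = hoare_partition(A, lo, hi)
        if j - lo + 1 < hi - j:      # left part is smaller
            quicksort_fast(A, lo, j)
            lo = j + 1               # continue iterating on the right part
        else:                        # right part is smaller (or equal)
            quicksort_fast(A, j + 1, hi)
            hi = j                   # continue iterating on the left part


if __name__ == "__main__":
    data = [5, 2, 9, 5, 1, 7, 3, 8, 5]
    quicksort_fast(data)
    print(data)  # [1, 2, 3, 5, 5, 5, 7, 8, 9]
```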