Design & Analysis of Algorithms
(MTCS102)
UNIT-1
Quick Sort Algorithm
From:
Asif Khan
Department of CSE
GNIOT, Greater Noida
Reference:
Thomas H. Cormen et al., Introduction to Algorithms
The MIT Press
Quick Sort
The quicksort algorithm has a worst-case running time of
Θ(n²) on an input array of n numbers. Despite this slow
worst-case running time, quicksort is often the best
practical choice for sorting because it is remarkably
efficient on the average: its expected running time is
Θ(n lg n), and the constant factors hidden in the Θ(n lg n)
notation are quite small. It also has the advantage of
sorting in place, and it works well even in virtual-memory
environments.
Quicksort, like merge sort, applies the divide-and-conquer
paradigm. Here are the three steps of the divide-and-conquer
process for sorting a typical subarray A[p .. r]:
Divide: Partition (rearrange) the array A[p .. r] into two
(possibly empty) subarrays A[p .. q-1] and A[q+1 .. r] such
that each element of A[p .. q-1] is less than or equal to A[q],
which is, in turn, less than or equal to each element of
A[q+1 .. r]. Computing the index q is also a part of this
partitioning procedure.
Conquer: Sort the two subarrays A[p .. q-1] and A[q+1 .. r]
by recursive calls to quicksort.
Combine: Because the subarrays are already sorted, no
work is needed to combine them: the entire array A[p .. r]
is now sorted.
The following procedure implements quicksort:
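The pseudocode figure is not reproduced in these notes, but the procedure can be sketched in Python roughly as follows (a minimal sketch using the Lomuto partition scheme; the names `quicksort` and `partition` mirror the QUICKSORT and PARTITION procedures of the text, with 0-based indices):

```python
def partition(A, p, r):
    """Lomuto partition: pivot is A[r]; returns its final index q."""
    x = A[r]                           # pivot element
    i = p - 1                          # boundary of the "<= x" region
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]    # grow the "<= x" region
    A[i + 1], A[r] = A[r], A[i + 1]    # place the pivot between the partitions
    return i + 1

def quicksort(A, p, r):
    """Sorts A[p .. r] in place by divide and conquer."""
    if p < r:
        q = quicksort_partition = partition(A, p, r)   # Divide
        quicksort(A, p, q - 1)                         # Conquer left
        quicksort(A, q + 1, r)                         # Conquer right
```

Calling `quicksort(A, 0, len(A) - 1)` sorts the whole array; no Combine step is needed.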
The key to the quicksort algorithm is the PARTITION
procedure, which rearranges the subarray A[p .. r] in place.
The operation of PARTITION on a sample subarray
A[p .. r]:
The previous figures show the operation of PARTITION on a
sample array. Array entry A[r] becomes the pivot element
x.
Lightly shaded array elements are all in the first partition,
with values no greater than x.
Heavily shaded elements are in the second partition, with
values greater than x.
The unshaded elements have not yet been put into one of the
first two partitions, and the final white element is the pivot x.
(a) The initial array and variable settings. None of
the elements has been placed in either of the first two
partitions.
(b) The value 2 is “swapped with itself” and put in the
partition of smaller values.
(c)–(d) The values 8 and 7 are added to the partition of
larger values.
(e) The values 1 and 8 are swapped, and the smaller
partition grows.
(f) The values 3 and 7 are swapped, and the smaller
partition grows.
(g)–(h) The larger partition grows to include 5 and 6, and
the loop terminates.
(i) The pivot element is swapped so that it lies between the
two partitions.
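Steps (a)–(i) can be replayed in code. Assuming the sample array is [2, 8, 7, 1, 3, 5, 6, 4] with pivot x = 4 (consistent with the swaps described above, though the figure itself is not reproduced here), a Lomuto-style PARTITION proceeds as follows:

```python
def partition(A, p, r):
    """Lomuto partition: pivot is A[r]; returns its final index."""
    x = A[r]                           # pivot, here 4
    i = p - 1
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]    # step (b): 2 is "swapped with itself";
                                       # steps (e), (f): 1<->8 and 3<->7 swaps
    A[i + 1], A[r] = A[r], A[i + 1]    # step (i): pivot moved between partitions
    return i + 1

A = [2, 8, 7, 1, 3, 5, 6, 4]
q = partition(A, 0, len(A) - 1)
# A == [2, 1, 3, 4, 7, 5, 6, 8], q == 3: elements <= 4 precede the
# pivot, and elements greater than 4 follow it.
```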
Performance of Quicksort
The running time of quicksort depends on whether the
partitioning is balanced or unbalanced, which in turn
depends on which elements are used for partitioning.
If the partitioning is balanced, the algorithm runs
asymptotically as fast as merge sort. If the partitioning is
unbalanced, however, it can run asymptotically as slowly
as insertion sort.
Worst-case partitioning: The worst-case behavior for
quicksort occurs when the partitioning routine produces
one subproblem with n-1 elements and one with 0
elements.
In quicksort, the worst case occurs when the given array is
already sorted or reverse sorted.
In the worst case, this unbalanced partitioning arises in
every recursive call. Since partitioning costs Θ(n) time,
the recurrence for the running time is
T(n) = T(n-1) + T(0) + Θ(n)
     = T(n-1) + Θ(n)
     = Θ(n²)   { by the iteration method }
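Unrolling the recurrence step by step (the iteration method referred to above, with the Θ(n) term written as cn):

```latex
T(n) = T(n-1) + cn
     = T(n-2) + c(n-1) + cn
     = \dots
     = T(0) + c \sum_{k=1}^{n} k
     = \Theta(1) + c \, \frac{n(n+1)}{2}
     = \Theta(n^2)
```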
Best-case partitioning: In the most even possible split,
PARTITION produces two subproblems, each of size no
more than n/2. In this case, quicksort runs much faster.
The recurrence for the running time in the best case is
T(n) = 2T(n/2) + Θ(n)
By case 2 of the master theorem,
T(n) = Θ(n lg n)
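As a one-line sanity check of the master-theorem application:

```latex
T(n) = 2T(n/2) + \Theta(n):\quad
a = 2,\; b = 2,\; n^{\log_b a} = n^{\log_2 2} = n,\;
f(n) = \Theta(n) = \Theta(n^{\log_b a})
\;\Longrightarrow\; T(n) = \Theta(n \lg n) \quad \text{(case 2)}
```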
Average-case partitioning: The average-case running
time of quicksort is much closer to the best case than to
the worst case. First we need to understand how the
balance of the partitioning is reflected in the recurrence
that describes the running time.
Suppose, for example, that the partitioning algorithm
always produces a 9-to-1 proportional split, which at first
seems quite unbalanced. We then obtain the recurrence
T(n) = T(9n/10) + T(n/10) + cn
Using the recursion-tree method, we can show that
T(n) = Θ(n lg n)
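A sketch of the recursion-tree argument: every level of the tree costs at most cn; the tree is full down to depth log₁₀ n (when the n/10 branch bottoms out), and the longest root-to-leaf path, repeatedly taking the 9/10 fraction, has depth log_{10/9} n. Hence

```latex
T(n) \;\ge\; cn \cdot \log_{10} n = \Omega(n \lg n),
\qquad
T(n) \;\le\; cn \cdot \bigl(\log_{10/9} n + 1\bigr) = O(n \lg n),
```

so T(n) = Θ(n lg n) even for this seemingly unbalanced 9-to-1 split.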
A randomized version of Quicksort
In exploring the average-case behavior of quicksort, we
have made the assumption that all permutations of the
input numbers are equally likely. In an engineering
situation, however, we cannot always expect this
assumption to hold. We can add randomization to the
quicksort algorithm in order to obtain good expected
performance over all inputs.
From our analysis of the quicksort algorithm it is clear
that the worst case can occur when the array that we
want to sort is already sorted or reverse sorted, and this
happens because we are choosing the last element A[r]
as the pivot. If we select the pivot randomly, then
we can make the chance of hitting the worst case negligible.
Instead of always using A[r] as the pivot, we will select a
randomly chosen element from the subarray A[p .. r].
We do so by first exchanging element A[r] with an
element chosen at random from A[p .. r]. By randomly
sampling the range p, …, r, we ensure that the pivot
element x = A[r] is equally likely to be any of the r - p + 1
elements in the subarray. Because we randomly choose the
pivot element, we expect the split of the input array to be
reasonably well balanced on average.
The changes to PARTITION and QUICKSORT are small.
In the new partition procedure, we simply implement the
swap before actually partitioning:
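The randomized version can be sketched in Python as follows (the names mirror the RANDOMIZED-PARTITION and RANDOMIZED-QUICKSORT procedures of CLRS; the only change from the deterministic version is the random swap performed before partitioning):

```python
import random

def partition(A, p, r):
    """Lomuto partition with pivot A[r]; returns the pivot's final index."""
    x = A[r]
    i = p - 1
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1

def randomized_partition(A, p, r):
    """Swap A[r] with a random element of A[p .. r], then partition."""
    k = random.randint(p, r)       # each of the r - p + 1 elements equally likely
    A[k], A[r] = A[r], A[k]
    return partition(A, p, r)

def randomized_quicksort(A, p, r):
    """Quicksort with a randomly chosen pivot in every recursive call."""
    if p < r:
        q = randomized_partition(A, p, r)
        randomized_quicksort(A, p, q - 1)
        randomized_quicksort(A, q + 1, r)
```

With this change, an already-sorted or reverse-sorted input is no longer a reliable worst case, since no particular input elicits the worst-case behavior anymore.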
THANK YOU