
CS1301:

DATA STRUCTURES
Sorting Techniques: Bubble Sort, Insertion Sort, Selection
Sort, Quick Sort, Merge Sort, Heap Sort, Bucket Sort
Bubble Sort
• Sort a list of n elements.
• In each iteration, find the maximum element in the remaining unsorted list and move it to the end by comparing successive elements. At most m-1 comparisons are needed per iteration, where m is the number of elements remaining in the unsorted list at that stage.

begin BubbleSort(arr)
  for all array elements
    if arr[i] > arr[i+1]
      swap(arr[i], arr[i+1])
    end if
  end for
  return arr
end BubbleSort
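
A minimal C sketch of the pseudocode above. The loop bounds, which the pseudocode leaves implicit, are spelled out; the function names and the driver in main are illustrative, not from the slides.

#include <stdio.h>

/* Swap two integers through pointers. */
static void swap(int *x, int *y) {
    int tmp = *x;
    *x = *y;
    *y = tmp;
}

/* Bubble sort: after pass k, the largest k elements sit at the end. */
void bubbleSort(int arr[], int n) {
    for (int pass = 0; pass < n - 1; pass++) {
        /* Compare successive elements in the still-unsorted prefix. */
        for (int i = 0; i < n - 1 - pass; i++) {
            if (arr[i] > arr[i + 1])
                swap(&arr[i], &arr[i + 1]);
        }
    }
}

int main(void) {
    int a[] = {5, 2, 4, 6, 1, 3};
    int n = sizeof a / sizeof a[0];
    bubbleSort(a, n);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);   /* prints: 1 2 3 4 5 6 */
    printf("\n");
    return 0;
}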

Bubble Sort: Example
Bubble Sort: Analysis
Worst case: list sorted in descending order
• Number of comparisons: 1+2+3+…+(n-1) = n(n-1)/2
• Number of swaps: 1+2+3+…+(n-1) = n(n-1)/2
• Time complexity: O(n²)

Best case: list sorted in ascending order
• Number of comparisons: 1+2+3+…+(n-1) = n(n-1)/2
• Number of swaps: 0 (no swaps required)
• Time complexity: O(n²)
Insertion Sort
• A single element is considered sorted.
• Pick one more element in each iteration and insert it into the existing sorted list. Make space for the newly picked element by moving larger elements towards the right.

i ← 1
while i < length(A)
  j ← i
  while j > 0 and A[j-1] > A[j]
    swap A[j] and A[j-1]
    j ← j - 1
  end while
  i ← i + 1
end while
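
A direct C translation of the pseudocode above, as a sketch; the function name insertionSort and the driver are assumptions, not part of the original slides.

#include <stdio.h>

/* Insertion sort: grows a sorted prefix A[0..i-1] one element at a time. */
void insertionSort(int A[], int n) {
    for (int i = 1; i < n; i++) {
        int j = i;
        /* Swap the new element left until the prefix is ordered again. */
        while (j > 0 && A[j - 1] > A[j]) {
            int tmp = A[j];
            A[j] = A[j - 1];
            A[j - 1] = tmp;
            j--;
        }
    }
}

int main(void) {
    int a[] = {5, 2, 4, 6, 1, 3};
    int n = sizeof a / sizeof a[0];
    insertionSort(a, n);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);   /* prints: 1 2 3 4 5 6 */
    printf("\n");
    return 0;
}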

Insertion Sort: Example


Insertion Sort: Analysis
Worst case: list sorted in descending order
• Number of comparisons: 1+2+3+…+(n-1) = n(n-1)/2
• Number of swaps: 1+2+3+…+(n-1) = n(n-1)/2
• Time complexity: O(n²)

Best case: list sorted in ascending order
• Number of comparisons: 1+1+1+…+1 (n-1 times) = n-1
• Number of swaps: 0 (no swaps)
• Time complexity: O(n)
Selection Sort
• Sort a list of n elements by selecting the correct element for each position, starting from the first position.
• In each iteration, the list of unsorted elements shrinks from the front.

Step 1 − Set MIN to location 0
Step 2 − Search for the minimum element in the unsorted list
Step 3 − Swap it with the value at location MIN
Step 4 − Increment MIN to point to the next element
Step 5 − Repeat until the list is sorted
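
A minimal C sketch of the steps above; the function name selectionSort and the driver are assumptions added for illustration.

#include <stdio.h>

/* Selection sort: in pass i, put the minimum of a[i..n-1] at position i. */
void selectionSort(int a[], int n) {
    for (int i = 0; i < n - 1; i++) {
        int min = i;                     /* index of the smallest element seen */
        for (int j = i + 1; j < n; j++) {
            if (a[j] < a[min])
                min = j;
        }
        int tmp = a[i];                  /* swap the minimum into position i */
        a[i] = a[min];
        a[min] = tmp;
    }
}

int main(void) {
    int a[] = {5, 2, 4, 6, 1, 3};
    int n = sizeof a / sizeof a[0];
    selectionSort(a, n);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);   /* prints: 1 2 3 4 5 6 */
    printf("\n");
    return 0;
}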
Selection Sort: Example
5 2 4 6 1 3   (initial list)
1 2 4 6 5 3   1st iteration, swap(a[0], a[4])
1 2 4 6 5 3   2nd iteration, swap(a[1], a[1])
1 2 3 6 5 4   3rd iteration, swap(a[2], a[5])
1 2 3 4 5 6   4th iteration, swap(a[3], a[5])
1 2 3 4 5 6   5th iteration, swap(a[4], a[4])

Selection Sort: Analysis
Worst case:
• Number of comparisons: (n-1)+(n-2)+…+2+1 = n(n-1)/2
• Number of swaps: 1+1+1+…+1 (n-1 times) = n-1
• Time complexity: O(n²)

Best case:
• Number of comparisons: (n-1)+(n-2)+…+2+1 = n(n-1)/2
• Number of swaps: 1+1+1+…+1 (n-1 times) = n-1
• Time complexity: O(n²)
Heap Sort
• Build a heap from the array elements using the BuildHeap method.
• In each iteration, delete the minimum element from the top of the heap and store it at the last position of the heap (the so-far unsorted part of the array).
• In every iteration, one more element gets added to the sorted half of the array.
• Finally we get the array sorted in descending order.
• To sort the array in ascending order, use a max-heap instead.

Heap Sort: Algo
heapSort(int a[], int n) {
  BuildHeap(a, n);
  for (i = n; i > 1; i--) {
    a[i-1] = deleteMin(a, i);
  }
}
• BuildHeap has time complexity: O(n)
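
BuildHeap and deleteMin are referenced above but not defined in the slides. The following is a hedged C sketch of how they might look for a min-heap, matching the descending-order output described; the siftDown helper is an assumption.

#include <stdio.h>

/* Restore the min-heap property for the subtree rooted at i, heap size n. */
static void siftDown(int a[], int n, int i) {
    while (2 * i + 1 < n) {
        int child = 2 * i + 1;                      /* left child */
        if (child + 1 < n && a[child + 1] < a[child])
            child++;                                /* pick the smaller child */
        if (a[i] <= a[child])
            break;
        int tmp = a[i]; a[i] = a[child]; a[child] = tmp;
        i = child;
    }
}

/* Build a min-heap in place: sift down every internal node, O(n) overall. */
void BuildHeap(int a[], int n) {
    for (int i = n / 2 - 1; i >= 0; i--)
        siftDown(a, n, i);
}

/* Remove and return the minimum of a heap of size n; the heap shrinks by one. */
int deleteMin(int a[], int n) {
    int min = a[0];
    a[0] = a[n - 1];          /* move the last element to the root */
    siftDown(a, n - 1, 0);    /* re-heapify the remaining n-1 elements */
    return min;
}

/* Min-heap version: produces the array in descending order, as in the slides. */
void heapSort(int a[], int n) {
    BuildHeap(a, n);
    for (int i = n; i > 1; i--)
        a[i - 1] = deleteMin(a, i);
}

int main(void) {
    int a[] = {5, 2, 4, 6, 1, 3};
    int n = sizeof a / sizeof a[0];
    heapSort(a, n);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);   /* prints: 6 5 4 3 2 1 */
    printf("\n");
    return 0;
}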

Time Complexity:
• Deleting the root element from the heap and heapifying: O(log n)
• Deleting n elements therefore takes: O(n log n)
• Total time required: O(n log n)
Heap Sort: Example
5 2 4 6 1 3   (initial array)
1 2 3 6 5 4   BuildHeap
2 4 3 6 5 1   Delete minimum and move to a[5]
3 4 5 6 2 1   Delete minimum and move to a[4]
4 6 5 3 2 1   Delete minimum and move to a[3]
5 6 4 3 2 1   Delete minimum and move to a[2]
6 5 4 3 2 1   Delete minimum and move to a[1]

Merge Sort
• Divide array into two halves, recursively sort left and right halves,
then merge two halves
• Keep dividing until only one element remains

Merge Sort: Example
8 2 9 4 5 3 1 6
• Divide it in two at the midpoint
• Conquer each side in turn (by recursively sorting)
• Merge the two halves together

Divide:  8 2 9 4 5 3 1 6
Divide:  8 2 9 4 | 5 3 1 6
Divide:  8 2 | 9 4 | 5 3 | 1 6
Divide:  8 | 2 | 9 | 4 | 5 | 3 | 1 | 6   (1 element each)
Merge:   2 8 | 4 9 | 3 5 | 1 6
Merge:   2 4 8 9 | 1 3 5 6
Merge:   1 2 3 4 5 6 8 9
Merge Sort: Algo
mergeSort(Arr, start, end) {
  if (start == end) return;
  else {
    mid = (start + end) / 2;
    mergeSort(Arr, start, mid);
    mergeSort(Arr, mid+1, end);
    merge(Arr, start, mid, end);   // merge the two sorted halves
  }
}
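
The merge routine is called above but not defined in the slides. The following is one common way to write it in C, as a sketch: it assumes a temporary buffer and the same (start, mid, end) index convention.

#include <stdlib.h>

/* Merge the sorted halves Arr[start..mid] and Arr[mid+1..end] in O(n) time. */
void merge(int Arr[], int start, int mid, int end) {
    int n = end - start + 1;
    int *tmp = malloc(n * sizeof *tmp);   /* temporary buffer for the merged run */
    int i = start, j = mid + 1, k = 0;

    while (i <= mid && j <= end)          /* take the smaller front element */
        tmp[k++] = (Arr[i] <= Arr[j]) ? Arr[i++] : Arr[j++];
    while (i <= mid)                      /* copy any leftovers from the left half */
        tmp[k++] = Arr[i++];
    while (j <= end)                      /* copy any leftovers from the right half */
        tmp[k++] = Arr[j++];

    for (k = 0; k < n; k++)               /* copy the merged run back into Arr */
        Arr[start + k] = tmp[k];
    free(tmp);
}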
Merge Sort: Analysis
• Merging sorted arrays of total length n takes: O(n)
• Recurrence relation for the time complexity of mergeSort:
  T(n) = 2T(n/2) + n
• Solving this gives: T(n) = O(n log n)
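
As a quick check of that claim, repeatedly expanding the recurrence (assuming n is a power of 2, n = 2^k):

T(n) = 2T(n/2) + n
     = 4T(n/4) + 2n
     = 8T(n/8) + 3n
     = 2^k T(n/2^k) + k·n
     = n·T(1) + n·log2(n)      (substituting k = log2(n))
     = O(n log n)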

Quick Sort
• Using a pivot element, partition the list into two parts: elements with a[i] < pivot and elements with a[i] > pivot.
• In each iteration, we find the right position for the pivot element picked.
• Run quick sort on the two parts recursively (a C sketch follows below).
• Pivot elements can be picked using multiple techniques; the basic technique is to pick the first element.
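
A minimal C sketch using the first element as pivot, per the basic technique above. The in-place partition scheme shown and the function names are assumptions for illustration.

#include <stdio.h>

static void swap(int *x, int *y) { int t = *x; *x = *y; *y = t; }

/* Partition a[lo..hi] around the first element as pivot; returns the pivot's
   final position. Elements left of it are smaller, elements right are not. */
static int partition(int a[], int lo, int hi) {
    int pivot = a[lo];
    int i = lo;                          /* boundary of the "< pivot" region */
    for (int j = lo + 1; j <= hi; j++) {
        if (a[j] < pivot)
            swap(&a[++i], &a[j]);        /* grow the "< pivot" region */
    }
    swap(&a[lo], &a[i]);                 /* place the pivot at its final position */
    return i;
}

void quickSort(int a[], int lo, int hi) {
    if (lo >= hi) return;                /* 0 or 1 element: already sorted */
    int p = partition(a, lo, hi);
    quickSort(a, lo, p - 1);             /* sort elements left of the pivot */
    quickSort(a, p + 1, hi);             /* sort elements right of the pivot */
}

int main(void) {
    int a[] = {65, 81, 92, 13, 43, 26, 31, 57, 75, 0};
    int n = sizeof a / sizeof a[0];
    quickSort(a, 0, n - 1);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);   /* prints: 0 13 26 31 43 57 65 75 81 92 */
    printf("\n");
    return 0;
}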
Quick Sort: Example
S = { 0, 65, 81, 92, 13, 43, 26, 31, 57, 75 }
Select pivot value: 65
Partition: S1 = { 0, 13, 43, 26, 31, 57 } (elements < pivot), S2 = { 81, 92, 75 } (elements > pivot)
QuickSort(S1) and QuickSort(S2): S1 = 0 13 26 31 43 57, S2 = 75 81 92
S is sorted: 0 13 26 31 43 57 65 75 81 92
Quick Sort: Best case
• Ideally, the pivot element divides the list into two equal parts. Hence the recurrence relation:
  T(n) = 2T(n/2) + n
• Solving this gives: T(n) = O(n log n)

Quick Sort: Worst case
• If the array is already sorted, the pivot divides the array into two parts with 0 and n-1 elements.
• The recurrence relation in the worst case therefore is:
  T(n) = T(n-1) + n
• Solving this gives: T(n) = O(n²)
Comparison-based Sorting
• Several sorting algorithms have been discussed; the best ones so far:
  • Heap sort and Merge sort: O(n log n)
  • Quick sort (the best one in practice): O(n log n) on average, O(n²) worst case
• Can we do better than O(n log n)?
  • No. It can be proven that any comparison-based sorting algorithm must carry out at least on the order of n log n operations.

Non-comparison based Sorting
• Suppose the values in the list to be sorted can repeat, but the values have a limit (e.g., values are digits from 0 to 9). Sorting, in this case, appears easier.
• Is it possible to come up with an algorithm better than O(n log n)?
  • Yes
  • The strategy will not involve comparisons

Counting Sort
• Idea: suppose the values are in the range 0..m-1; start with m empty buckets numbered 0 to m-1, scan the list, and increase the count of elements in the corresponding bucket.
• We need an array of buckets, and the values in the list to be sorted are the indexes into the buckets.
• No comparisons are necessary.
Counting Sort: Example
Input:          4 2 1 2 0 3 2 1 4 0 2 3 0
Bucket counts:  bucket 0 → 3, bucket 1 → 2, bucket 2 → 4, bucket 3 → 2, bucket 4 → 2
Output:         0 0 0 1 1 2 2 2 2 3 3 4 4
Counting Sort: Algo
CountingSort(S) {              // values in S are between 0 and m-1
  for (j = 0; j < m; j++)      // initialize m buckets
    b[j] = 0;
  for (i = 0; i < n; i++)      // place elements in their appropriate buckets
    b[S[i]] = b[S[i]] + 1;

  i = 0;
  for (j = 0; j < m; j++) {    // place elements from the buckets back in S
    for (r = 1; r <= b[j]; r++) {
      S[i] = j;
      i = i + 1;
    }
  }
}
Counting Sort: Analysis
• Bucket initialization: O(m)
• From array to buckets: O(n)
• From buckets to array: O(n)
• Since m will likely be small compared to n, counting sort is O(n)
• Strictly speaking, the time complexity is O(n + m)

Sorting integers
• Can we perform counting sort on any array of (non-negative) integers?
• Yes, but note that the number of buckets will depend on the maximum integer value.
• If you are sorting 1000 integers and the maximum value is 999999, you will need 1 million buckets!
• The time complexity is not really O(n) because m is much greater than n; the actual time complexity is O(m).
• Can we do better?

Bucket Sort
• Idea: divide the n elements into M buckets. Sort each bucket independently using insertion sort. Finally, sequentially read all the elements from the buckets, starting from the first bucket. The resulting array is sorted (a C sketch follows below).
• The array to be sorted can contain any non-negative integers, but they should be roughly uniformly distributed, i.e., it should not be the case that all elements go into one bucket.
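
A hedged C sketch of the idea above: values in [0, maxValue] are scattered into M equal-range buckets, each bucket is insertion-sorted, and the buckets are read back in order. The bucket count M, the index formula, and the function names are assumptions.

#include <stdio.h>
#include <stdlib.h>

#define M 5                      /* number of buckets (an arbitrary choice) */

/* Insertion sort on one bucket, as the slides suggest. */
static void insertionSort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int key = a[i], j = i - 1;
        while (j >= 0 && a[j] > key) { a[j + 1] = a[j]; j--; }
        a[j + 1] = key;
    }
}

/* Bucket sort for non-negative integers in [0, maxValue].
   Bucket index = value * M / (maxValue + 1), giving roughly equal value ranges. */
void bucketSort(int a[], int n, int maxValue) {
    int *bucket[M], count[M];
    for (int b = 0; b < M; b++) {
        bucket[b] = malloc(n * sizeof(int));   /* worst case: all in one bucket */
        count[b] = 0;
    }
    for (int i = 0; i < n; i++) {              /* scatter elements into buckets */
        int b = (int)((long long)a[i] * M / (maxValue + 1));
        bucket[b][count[b]++] = a[i];
    }
    int k = 0;
    for (int b = 0; b < M; b++) {              /* sort each bucket and gather */
        insertionSort(bucket[b], count[b]);
        for (int i = 0; i < count[b]; i++)
            a[k++] = bucket[b][i];
        free(bucket[b]);
    }
}

int main(void) {
    int a[] = {29, 3, 71, 45, 12, 88, 54, 9, 63, 37};
    int n = sizeof a / sizeof a[0];
    bucketSort(a, n, 99);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);   /* prints: 3 9 12 29 37 45 54 63 71 88 */
    printf("\n");
    return 0;
}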
Bucket Sort: Example
Radix Sort
• Idea: repeatedly sort by digit: perform multiple bucket sorts on S, starting with the rightmost digit.
• If the maximum value is 999999, only ten buckets (not 1 million) will be necessary.
• Use this strategy when the keys are integers and there is a reasonable limit on their values.
• The number of passes (bucket sort stages) will depend on the number of digits in the maximum value (a C sketch follows below).
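
A minimal C sketch of LSD radix sort: one stable pass per decimal digit, starting with the rightmost. The slides describe each pass with ten buckets; this sketch implements the same idea with a counting array per pass. The function names are assumptions.

#include <stdio.h>
#include <string.h>

/* One stable pass on the digit selected by 'exp' (1, 10, 100, ...). */
static void countingPass(int a[], int n, int exp) {
    int output[n];
    int count[10] = {0};
    for (int i = 0; i < n; i++)               /* count keys with each digit value */
        count[(a[i] / exp) % 10]++;
    for (int d = 1; d < 10; d++)              /* prefix sums: end position per digit */
        count[d] += count[d - 1];
    for (int i = n - 1; i >= 0; i--)          /* fill output backwards for stability */
        output[--count[(a[i] / exp) % 10]] = a[i];
    memcpy(a, output, n * sizeof(int));
}

/* LSD radix sort for non-negative integers: one pass per decimal digit. */
void radixSort(int a[], int n) {
    int max = a[0];
    for (int i = 1; i < n; i++)
        if (a[i] > max) max = a[i];
    for (int exp = 1; max / exp > 0; exp *= 10)
        countingPass(a, n, exp);
}

int main(void) {
    int a[] = {12, 58, 37, 64, 52, 36, 99, 63, 18, 9, 20, 88, 47};
    int n = sizeof a / sizeof a[0];
    radixSort(a, n);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);   /* prints: 9 12 18 20 36 37 47 52 58 63 64 88 99 */
    printf("\n");
    return 0;
}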
Radix Sort: Example (1st Pass)
Input:  12 58 37 64 52 36 99 63 18 9 20 88 47
Sort by the rightmost digit (buckets 0 to 9):
  0: 20   2: 12 52   3: 63   4: 64   6: 36   7: 37 47   8: 58 18 88   9: 9 99
After 1st pass: 20 12 52 63 64 36 37 47 58 18 88 9 99

Radix Sort: Example (2nd Pass)
Input:  20 12 52 63 64 36 37 47 58 18 88 9 99
Sort by the next digit (tens):
  0: 9   1: 12 18   2: 20   3: 36 37   4: 47   5: 52 58   6: 63 64   8: 88   9: 99
After 2nd pass: 9 12 18 20 36 37 47 52 58 63 64 88 99

Radix Sort: Example (Both Passes)
12 58 37 64 52 36 99 63 18 9 20 88 47    (original list)
20 12 52 63 64 36 37 47 58 18 88 9 99    (sort by rightmost digit)
9 12 18 20 36 37 47 52 58 63 64 88 99    (sort by leftmost digit)

Radix Sort: Analysis
• If there is a fixed number p of bucket sort stages (six stages in the case where the maximum value is 999999), then radix sort is O(n).
• There are p bucket sort stages, each taking O(n) time.
• Strictly speaking, the time complexity is O(pn), where p is the number of digits (note that p = log10(m), where m is the maximum value in the list).
Radix Sort: For Words
• Note that only 10 buckets are needed regardless of the number of stages, since the buckets are reused at each stage.
• Radix sort can also be applied to words:
  • Set a limit on the number of letters in a word.
  • Use 27 buckets (or more, depending on the letters/characters allowed), one for each letter plus a "blank" character.
  • The word-length limit is exactly the number of bucket sort stages needed.
