Lecture 12
Sorting Algorithms and Their Efficiency

Sorting is a process that organizes a collection of data into either ascending or descending order.
Selection sort
• To sort an array into ascending order, first search for the largest element.

• Because you want the largest element to be in the last position of the array, you swap the last item with the largest item, even if these items appear to be identical.

• Now, ignoring the last (largest) item of the array, search the rest of the array for its largest item and swap it with its last item, which is the next-to-last item in the original array.

• You continue until you have selected and swapped N-1 of the N items in the array.

• The remaining item, which is now in the first position of the array, is in its proper order.
Selection sort
(In the slide, shaded elements are the selected items and elements already in order are coloured red.)

Initial array    29 10 14 37 13
After 1st swap   29 10 14 13 37
After 2nd swap   13 10 14 29 37
After 3rd swap   13 10 14 29 37
After 4th swap   10 13 14 29 37
Algorithm Analysis
Last    = index of the last item of the subarray
L       = largest value found so far
max_pos = index of the largest value found so far

for (Last = N-1 down to 1)
{
    L = a[0];
    max_pos = 0;
    for (j = 1 to Last)              // find the largest item in a[0..Last]
    {
        if (a[j] > L)
        {
            L = a[j];
            max_pos = j;
        }
    }
    swap(a[max_pos], a[Last]);       // move it to the end of the subarray
}

The outer for loop runs N-1 times, and on each pass the inner segment runs Last times,
so the comparisons total (N-1)+(N-2)+(N-3)+...+1 = N*(N-1)/2.
Each swap operation requires 3 moves, so the swaps require 3*(N-1) moves in total.

Discussion
Selection sort is independent of the initial arrangement of the data.

Selection sort requires N*(N-1)/2 comparisons and N-1 swap operations.

Selection sort uses few swap operations, that is O(N), so it can be preferred when the cost of a swap is very high.

Together, selection sort of N items requires
• N*(N-1)/2 + N-1 major operations.

The order of selection sort is quadratic, O(N^2).
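As a concrete illustration of the analysis above, here is a minimal compilable C sketch of selection sort; the names selectionSort and swap and the driver main are illustrative choices, not taken from the slides.

#include <stdio.h>

/* Swap two integers; each call performs 3 data moves. */
static void swap(int *x, int *y)
{
    int tmp = *x;
    *x = *y;
    *y = tmp;
}

/* Selection sort: N*(N-1)/2 comparisons and N-1 swaps. */
void selectionSort(int a[], int N)
{
    for (int last = N - 1; last >= 1; last--)
    {
        int maxPos = 0;                    /* index of the largest item so far */
        for (int j = 1; j <= last; j++)
            if (a[j] > a[maxPos])
                maxPos = j;
        swap(&a[maxPos], &a[last]);        /* place the largest item at the end of the subarray */
    }
}

int main(void)
{
    int a[] = {29, 10, 14, 37, 13};        /* the slide's example array */
    selectionSort(a, 5);
    for (int i = 0; i < 5; i++)
        printf("%d ", a[i]);               /* prints: 10 13 14 29 37 */
    printf("\n");
    return 0;
}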
Insertion sort
Divide the array into a sorted section at the front (initially empty) and an unsorted section at the end.
Step iteratively through the array, moving each element to its proper place in the sorted section.

    Sorted               Unsorted
    0 .......... i       .......... N-1
          (after i iterations)
Insertion sort
Initial array   29 10 14 37 13    Copy 10
                29 29 14 37 13    Shift 29
                10 29 14 37 13    Insert 10, copy 14
                10 29 29 37 13    Shift 29
                10 14 29 37 13    Insert 14, copy 37
                10 14 29 37 13    Insert 37 on itself
                10 14 29 37 13    Copy 13
                10 14 14 29 37    Shift 14, 29, 37
Sorted array    10 13 14 29 37    Insert 13
Implementation
void insertionSort(int numbers[], int N)
{
    int i, j, index;

    for (i = 1; i < N; i++)                  // outer loop runs N-1 times and moves
    {                                        // data items twice per iteration,
        index = numbers[i];                  // that is 2*(N-1) moves in total
        j = i;
        while ((j > 0) && (numbers[j - 1] > index))
        {                                    // inner loop executes at most i times;
            numbers[j] = numbers[j - 1];     // in the worst case that is
            j = j - 1;                       // 1+2+3+...+(N-1) = N*(N-1)/2 times
        }
        numbers[j] = index;
    }
}
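A minimal usage sketch for the routine above; the driver main is an illustrative addition, and the test values are the slide's example array.

#include <stdio.h>

void insertionSort(int numbers[], int N);    /* defined above */

int main(void)
{
    int numbers[] = {29, 10, 14, 37, 13};
    insertionSort(numbers, 5);
    for (int i = 0; i < 5; i++)
        printf("%d ", numbers[i]);           /* prints: 10 13 14 29 37 */
    printf("\n");
    return 0;
}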
Discussion
In the worst case, insertion sort requires N*(N-1)/2 comparisons and N*(N-1)/2 data moves (in the inner loop).

The outer loop requires 2*(N-1) data moves.

Together, the total number of major operations is
• 2*N*(N-1)/2 + 2*(N-1)
  = N^2 + N - 2

In the best case (an already sorted array) the inner loop never shifts any item, so only N-1 comparisons are needed.

Best case O(N) and worst case O(N^2).
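To see the best-case/worst-case gap concretely, here is a small instrumented C sketch; the comparison counter and the helper name sortAndCount are added for illustration and are not part of the slides.

#include <stdio.h>

/* Insertion sort with a comparison counter added for illustration. */
static long sortAndCount(int a[], int n)
{
    long comparisons = 0;
    for (int i = 1; i < n; i++)
    {
        int index = a[i];
        int j = i;
        /* the comma operator counts each evaluation of the key comparison */
        while (j > 0 && (++comparisons, a[j - 1] > index))
        {
            a[j] = a[j - 1];
            j--;
        }
        a[j] = index;
    }
    return comparisons;
}

int main(void)
{
    int best[]  = {1, 2, 3, 4, 5};    /* already sorted: N-1 = 4 comparisons        */
    int worst[] = {5, 4, 3, 2, 1};    /* reverse sorted: N*(N-1)/2 = 10 comparisons */
    printf("best case:  %ld comparisons\n", sortAndCount(best, 5));
    printf("worst case: %ld comparisons\n", sortAndCount(worst, 5));
    return 0;
}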
Mergesort
Divide and conquer algorithm
Recursively split the input in half, then recursively merge the pairs of pieces
Recursive steps:
• Divide the array in half
• Sort the halves
• Merge the halves inside a temporary array
• Copy the temporary array back to the appropriate locations in the original array
Mergesort
The recursive calls continue to divide the array into pieces until each piece contains only one item.
An array of one item is obviously sorted.
The algorithm then merges these small pieces until one sorted array results.
Merge sort Example

Recursive calls to Mergesort (divide):
  38 16 27 39 12 27
  38 16 27 | 39 12 27
  38 16 | 27 | 39 12 | 27
  38 | 16 | 39 | 12

Merge steps:
  16 38 | 12 39
  16 27 38 | 12 27 39
  12 16 27 27 38 39
Merge the halves

The merging step requires at most N-1 comparisons,
N moves from the original array to the temporary array, and
N moves from the temporary array back to the original array.
In total: 3*N - 1 major operations.
Pseudocode of Merge sort

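A minimal sketch of the recursive driver, written in the same C-like style as the Merge routine on the next slide; the function name Mergesort and its use of Merge are assumptions based on that routine, not the slide's original pseudocode.

/* Sketch: sort A[F..L] by recursively sorting the halves and merging them. */
void Mergesort(Datatype A[], int F, int L)
{
    if (F < L)
    {
        int Mid = (F + L) / 2;        /* divide the array in half   */
        Mergesort(A, F, Mid);         /* sort the first half        */
        Mergesort(A, Mid + 1, L);     /* sort the second half       */
        Merge(A, F, Mid, L);          /* merge the sorted halves    */
    }
}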
void Merge(Datatype A[], int F, int Mid, int L)
{
    int First1 = F;       int Last1 = Mid;    // F to Mid is the first subarray
    int First2 = Mid + 1; int Last2 = L;      // Mid+1 to L is the second subarray

    Datatype TempArray[MAX_SIZE];
    int index = First1;                       // next available location in the temp array

    // While both subarrays are non-empty, copy the smaller item into the temporary array
    for (; (First1 <= Last1) && (First2 <= Last2); ++index)
    {
        if (A[First1] < A[First2])
        {
            TempArray[index] = A[First1];
            ++First1;
        }
        else
        {
            TempArray[index] = A[First2];
            ++First2;
        }
    }

    for (; First1 <= Last1; ++First1, ++index)   // finish off the first subarray, if necessary
        TempArray[index] = A[First1];

    for (; First2 <= Last2; ++First2, ++index)   // finish off the second subarray, if necessary
        TempArray[index] = A[First2];

    for (index = F; index <= L; ++index)         // copy the result back into the original array
        A[index] = TempArray[index];
}
Algorithm analysis
At the top level the Merge function is called once; it merges all N items and requires 3*N - 1 operations.

At level 1 there are two calls to the Merge function; each merges N/2 items, so together they require 2*(3*N/2 - 1) = 3*N - 2 operations.

At level m there are 2^m calls to the Merge function; each merges N/2^m items, so together they require 2^m * (3*N/2^m - 1) = 3*N - 2^m operations.

So the merging work at each level is of order O(N).
Each call to Mergesort halves the array, which means there are K = log N levels.
There are O(log N) levels and each level requires O(N) operations.
Therefore Mergesort is of order O(N*log N).
Discussion
Mergesort is O(N*log N) in both the average and the worst case.
Mergesort is significantly faster than selection sort or insertion sort, which are of order O(N^2).
Mergesort, though very efficient, has one drawback: it requires a temporary array equal in size to the original array.
Quicksort
Divide and conquer algorithm

Quicksort works by partitioning the array into two pieces separated by a single element that is greater than all the elements in the left part and smaller than or equal to all the elements in the right part.

This guarantees that the single element, called the pivot element, is in its correct position.

Then the algorithm proceeds, applying the same method to the two parts separately.
Quicksort
Partition (divide)
• Choose a pivot
• Find the position for the pivot so that
  • all elements to the left are less
  • all elements to the right are greater than or equal

  [ < pivot | pivot | >= pivot ]
Quicksort
Conquer
Apply the same algorithm to each part

  [ < pivot ]                        [ >= pivot ]
  [ < p' | p' | >= p' ]    pivot     [ < p" | p" | >= p" ]
Partition Algorithm
Must arrange the items into two regions: S1, the set of items less than the pivot, and S2, the set of items greater than or equal to the pivot.
There are different algorithms for the choice of a pivot.
Retain the pivot in position A[R].
The items that await placement are in another region, called the unknown region.

  |   S1 ( < P )   |   S2 ( >= P )   |   Unknown ( ? )   | P |
                   ^                 ^                   ^   ^
                LastS1          FirstUnknown             L   R

At each step of the partition algorithm you examine one item from the unknown region and place it in S1 or S2.
How to Perform the MOVEs
• To move A[FirstUnknown] into S1:

  |   S1 ( < P )   |   S2 ( >= P )   |  Unknown: < P  ?  ... | P |
                   ^        ^        ^                        ^  ^
                LastS1  LastS1+1  FirstUnknown                L  R
               (swap A[LastS1+1] with A[FirstUnknown])

• Swap A[FirstUnknown] with the first item of S2, which is A[LastS1+1], and then increment LastS1 by 1.
• Thus the item that was in A[FirstUnknown] will be at the rightmost position of S1.
• The item of S2 that was moved to A[FirstUnknown]: if you increment FirstUnknown by 1, that item becomes the rightmost member of S2.
• Thus, statements for the following steps are required:
  • Swap A[FirstUnknown] with A[LastS1+1]
  • Increment LastS1
  • Increment FirstUnknown
How to Perform the MOVEs
• To move A[FirstUnknown] into S2:

  |   S1 ( < P )   |   S2 ( >= P )   |   Unknown ( ? )   | P |
                   ^                 ^                   ^   ^
                LastS1          FirstUnknown             L   R

• Region S2 and the unknown region are adjacent.
• Simply increment FirstUnknown by 1; S2 expands to the right.

To move the pivot between S1 and S2 (see the sketch below):
• Interchange A[LastS1+1], the leftmost item of S2, with the pivot.
• Thus, the pivot will be in its correct position.
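Here is a minimal C sketch of the partition scheme described on these slides, choosing A[R] itself as the pivot for simplicity; the function name partitionRegions and the 0-based indexing are illustrative assumptions, not the slides' own code.

/* Partition A[F..R] using the S1 / S2 / Unknown scheme described above.
   The pivot is kept in A[R]; the function returns the pivot's final index. */
int partitionRegions(int A[], int F, int R)
{
    int pivot = A[R];
    int lastS1 = F - 1;                        /* S1 = A[F..lastS1] starts out empty */

    for (int firstUnknown = F; firstUnknown < R; firstUnknown++)
    {
        if (A[firstUnknown] < pivot)           /* the item belongs in S1 */
        {
            /* swap A[firstUnknown] with the first item of S2, A[lastS1+1] */
            int tmp = A[lastS1 + 1];
            A[lastS1 + 1] = A[firstUnknown];
            A[firstUnknown] = tmp;
            lastS1++;
        }
        /* otherwise the item belongs in S2: S2 simply expands to the right */
    }

    /* place the pivot between S1 and S2 by swapping it with A[lastS1+1] */
    int tmp = A[lastS1 + 1];
    A[lastS1 + 1] = A[R];
    A[R] = tmp;
    return lastS1 + 1;
}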
Example: Quicksort (only one partition step shown)

Choose pivot 27 and keep it in A[R]; the pivot is shown after the bar.

  38 12 39 27 16 | 27    FirstUnknown = 0 (points to 38); S1 is empty
  38 12 39 27 16 | 27    38 belongs in S2
  38 12 39 27 16 | 27    12 belongs in S1: swap 38 & 12
  12 38 39 27 16 | 27    39 belongs in S2
  12 38 39 27 16 | 27    27 belongs in S2
  12 38 39 27 16 | 27    16 belongs in S1: swap 38 & 16
  12 16 39 27 38 | 27    no more unknown items
  12 16 27 27 38 39      place the pivot between S1 and S2: swap 39 & 27
Quicksort Code
p: index of the first element
r: index of the last element

Quicksort(A, p, r)
{
    if (p < r)
    {
        q = Partition(A, p, r)
        Quicksort(A, p, q-1)
        Quicksort(A, q+1, r)
    }
}

• The initial call is Quicksort(A, 1, n), where n is the length of A


Partition
Clearly, all the action takes place in the partition() function
• Rearranges the subarray in place
• End result:
  • Two subarrays
  • All values in the first subarray <= all values in the second
• Returns the index of the "pivot" element separating the two subarrays
Partition Code
Partition(A, p, r)
{
    x = A[r]                       // x is the pivot
    i = p - 1
    for j = p to r - 1
    {
        if A[j] <= x
        {
            i = i + 1
            exchange A[i] <-> A[j]
        }
    }
    exchange A[i+1] <-> A[r]       // put the pivot into its final position
    return i+1
}

Partition() runs in O(N) time
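For reference, a compilable 0-based C rendering of the Quicksort and Partition pseudocode above; this is a sketch, and the lower-case function names and the driver main are choices made here, not the slides'.

#include <stdio.h>

/* Partition A[p..r] around the pivot A[r]; returns the pivot's final index. */
static int partition(int A[], int p, int r)
{
    int x = A[r];                              /* x is the pivot */
    int i = p - 1;
    for (int j = p; j <= r - 1; j++)
    {
        if (A[j] <= x)
        {
            i = i + 1;
            int t = A[i]; A[i] = A[j]; A[j] = t;
        }
    }
    int t = A[i + 1]; A[i + 1] = A[r]; A[r] = t;
    return i + 1;
}

static void quicksort(int A[], int p, int r)
{
    if (p < r)
    {
        int q = partition(A, p, r);
        quicksort(A, p, q - 1);
        quicksort(A, q + 1, r);
    }
}

int main(void)
{
    int A[] = {2, 8, 7, 1, 3, 5, 6, 4};        /* the example array below */
    quicksort(A, 0, 7);                        /* 0-based first and last indices */
    for (int i = 0; i < 8; i++)
        printf("%d ", A[i]);                   /* prints: 1 2 3 4 5 6 7 8 */
    printf("\n");
    return 0;
}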
Partition Example
A = {2, 8, 7, 1, 3, 5, 6, 4},  p = first index,  r = last index,  pivot x = A[r] = 4

  2 8 7 1 3 5 6 4    j = p: 2 <= 4, so i advances and A[i] is exchanged with A[j] (no change)
  2 8 7 1 3 5 6 4    8 > 4 and 7 > 4: nothing moves
  2 1 7 8 3 5 6 4    1 <= 4: i advances, exchange 8 and 1
  2 1 3 8 7 5 6 4    3 <= 4: i advances, exchange 7 and 3
  2 1 3 8 7 5 6 4    5 > 4 and 6 > 4: nothing moves
  2 1 3 4 7 5 6 8    finally, exchange A[i+1] = 8 with the pivot A[r] = 4 and return i+1
Discussion
• In the worst case, efficiency is O(N^2)
  – But the worst case is easy to avoid, for example by choosing the pivot at random (see the sketch below)
• On average, efficiency is O(N log N)
• Better space complexity than mergesort
• In practice, quicksort runs fast and is widely used
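A minimal sketch of one common way to avoid the worst case, randomizing the pivot choice; it assumes the partition() and quicksort() sketch given after the Partition Code slide, and the function names used here are illustrative.

#include <stdlib.h>    /* rand() */

/* Randomized partition: swap a randomly chosen element into A[r],
   then partition exactly as before (assumes the partition() sketch above). */
static int randomizedPartition(int A[], int p, int r)
{
    int k = p + rand() % (r - p + 1);          /* random index in [p, r] */
    int t = A[k]; A[k] = A[r]; A[r] = t;
    return partition(A, p, r);
}

static void randomizedQuicksort(int A[], int p, int r)
{
    if (p < r)
    {
        int q = randomizedPartition(A, p, r);
        randomizedQuicksort(A, p, q - 1);
        randomizedQuicksort(A, q + 1, r);
    }
}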
Comparison of Sorting Algorithms

Algorithm         Worst Case
Selection Sort    N^2
Bubble Sort       N^2
Insertion Sort    N^2
Mergesort         N * log N
Quicksort         N^2
Online References
http://www.bluffton.edu/~nesterd/java/SortingDemo.html
http://www.youtube.com/watch?v=y_G9BkAm6B8
http://www.sorting-algorithms.com/quick-sort
http://www.cs.auckland.ac.nz/~jmor159/PLDS210/ds_ToC.html
http://www.youtube.com/watch?v=vxENKlcs2Tw
