Task 1 To 4

Task 1: Algorithmic problem solving 1

AIM

Calculating the time complexities for the given algorithms and estimating the orders of growth

EX1

Constant Complexity - O(1)

• If an algorithm's time complexity is constant, it always runs in the same amount of time, no matter the input size. For example, getting the first item of an array takes the same time regardless of how big the array is.

#include <stdio.h>
int main()
{
    int a = 4;
    int b = 6;
    int c;
    c = a + b;
    printf("%d", c);
}

Time Complexity Calculation:
The time complexity of the above program is O(1), because it consists only of assignments and arithmetic operations, each of which is executed exactly once.

EX2

Linear time – O(n)

An algorithm is said to have linear time complexity when the running time increases linearly with the length of the input. This order, O(n), arises when the function has to examine every value in the input data.
for (int i = 1; i <= n; i += c)
{
    // some expressions
}

Time Complexity Calculation:
The loop variable is incremented / decremented by a constant amount, so the loop runs about n/c times, which is O(n).

void function(int n)
{
    if (n == 1)
        return;
    for (int i = 1; i <= n; i++)
    {
        for (int j = 1; j <= n; j++)
        {
            printf("*");
            break;
        }
    }
}

Time Complexity Calculation:
Even though the inner loop is bounded by n, the break statement makes it execute only once per iteration of the outer loop, so the overall complexity is O(n).

int m = 0;
for (int i = 0; i < n; i++)   // executed n times
{
    m = m + 1;
}

Time Complexity Calculation:
The loop body is executed n times, so the complexity will be O(n).

EX3

Quadratic time – O(n^2)

An algorithm is said to have quadratic time complexity when the running time grows as the square of the input length (n^2). Nested loops generally come under this order: when one loop takes O(n) and the function has another loop within it, the total is O(n) * O(n) = O(n^2).

#include <stdio.h>
int main()
{
    int i, j, n = 8;
    for (i = 1; i <= n; i++)
    {
        for (j = 1; j <= n; j++)
        {
            printf("VELTECH\n");
        }
    }
}

Time Complexity Calculation:
In the given snippet, the first and second for loops each execute n times, so the printf statement runs n * n times and the complexity is O(n^2).

for (int i = 1; i <= n; i += c)
{
    for (int j = 1; j <= n; j += c)
    {
        // some expressions
    }
}

Time Complexity Calculation:
• The running time of nested loops is equal to the number of times the innermost statement is executed.
• Selection Sort and Insertion Sort have O(n^2) time complexity.

int m = 0;
// executed n times
for (int i = 0; i < n; i++) {
    m = m + 1;
}
// outer loop executed n times
for (int i = 0; i < n; i++) {
    // inner loop executed n times
    for (int j = 0; j < n; j++)
        m = m + 1;
}

Time Complexity Calculation:
The first for loop executes n times, and the nested loops execute n * n times, so the total work is n + n*n, which is O(n^2).

EX4

Logarithmic time – O(log n)

An algorithm is said to have logarithmic time complexity when it reduces the size of the input data in each step, so the number of operations grows much more slowly than the input size. Such algorithms are found in binary trees and binary search: a given value is searched for in an array by splitting the array in two and continuing the search in only one half, which ensures the operation is not performed on every element of the data.

for (int i = 1; i <= n; i *= c)
{
    // some expressions
}
for (int i = n; i > 0; i /= c)
{
    // some expressions
}

Time Complexity Calculation:
• The loop variable is multiplied / divided by a constant amount each iteration, so the loop runs about log n times.
• Binary Search has O(log n) time complexity.
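Since Binary Search is cited above as the standard O(log n) algorithm, a minimal iterative sketch in C is added here for illustration (it is not part of the original lab code):

#include <stdio.h>

// Iterative binary search: the search range is halved on every iteration,
// so at most about log2(n) comparisons are made.
int binary_search(int a[], int n, int key)
{
    int low = 0, high = n - 1;
    while (low <= high)
    {
        int mid = low + (high - low) / 2;
        if (a[mid] == key)
            return mid;        // found: return the index
        else if (a[mid] < key)
            low = mid + 1;     // discard the left half
        else
            high = mid - 1;    // discard the right half
    }
    return -1;                 // not found
}

int main()
{
    int a[] = {2, 7, 11, 23, 45, 67, 90};    // must already be sorted
    int n = sizeof(a) / sizeof(a[0]);
    printf("%d\n", binary_search(a, n, 23)); // prints 3
    return 0;
}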

EX5

O(n log n) — A log-linear algorithm

O(n log n) is known as log-linear complexity: roughly log n operations are performed n times. O(n log n) time is common in recursive divide-and-conquer sorting algorithms, sorting using a binary tree sort, and other efficient comparison-based sorts such as heapsort.

This time complexity typically arises when an O(log n) step is executed inside a loop (or recursion) that runs O(n) times.
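As a simple illustration of this pattern (a sketch added for clarity, assuming n is already declared as in the earlier snippets), an inner loop that runs about log2(n) times placed inside an outer loop that runs n times gives O(n log n) total work:

int m = 0;
for (int i = 1; i <= n; i++)         // outer loop: n iterations
{
    for (int j = 1; j <= n; j *= 2)  // inner loop: about log2(n) iterations
        m = m + 1;                   // executed roughly n * log2(n) times in total
}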

MERGE SORT

Time Complexity Calculation:
Merge Sort is a recursive algorithm whose time complexity can be expressed by the recurrence relation T(n) = 2T(n/2) + O(n). The solution of this recurrence is O(n log n): the list of size n is divided over at most log n levels, and merging all the sublists at each level takes O(n) time, so the worst-case running time is O(n log n).
Best Case Time Complexity: O(n log n)
Worst Case Time Complexity: O(n log n)
Average Time Complexity: O(n log n)
The time complexity of Merge Sort is O(n log n) in all three cases (worst, average, and best), because it always divides the array into two halves and takes linear time to merge them.
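A brief sketch of why this recurrence solves to O(n log n) (added here for clarity):
T(n) = 2T(n/2) + cn
     = 4T(n/4) + 2cn
     = 8T(n/8) + 3cn
     = ...
     = 2^k T(n/2^k) + k*cn
Setting k = log2 n gives T(n) = n*T(1) + cn*log2 n, which is O(n log n).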
EX6

Exponential time – O(2^n)

An algorithm has exponential time complexity when its running time roughly doubles with each additional input element. Exponential growth is the opposite of logarithmic growth: logarithmic growth becomes efficient after a certain number of inputs, but exponential growth becomes slower and less efficient as the input size grows.
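A classic example is the naive recursive Fibonacci function: each call spawns two further calls, so the number of calls grows roughly as O(2^n). This sketch is added here for illustration and is not part of the original lab code:

#include <stdio.h>

// Naive recursive Fibonacci: each call makes two more recursive calls,
// so the total number of calls grows exponentially with n.
long long fib(int n)
{
    if (n <= 1)
        return n;
    return fib(n - 1) + fib(n - 2);
}

int main()
{
    printf("%lld\n", fib(10));  // prints 55
    return 0;
}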

EX7

Cubic time – O(n^3)

An algorithm is said to run in cubic time when the running time of three nested loops is proportional to the cube of n: as n grows, the running time grows as n * n * n (for example, doubling n makes the running time about eight times larger).

int m = 0;
// outer loop executed n/2 times
for (int i = n/2; i < n; i++) {
    // middle loop executed n/2 times
    for (int j = n/2; j < n; j++)
        // inner loop executed n times
        for (int k = 0; k < n; k++)
            m = m + 1;
}

Time Complexity Calculation:
The three nested loops execute (n/2) * (n/2) * n times, so the complexity will be O(n^3).

Result
Thus the time complexities of the given algorithms were calculated and their orders of growth were estimated successfully.

Task 2: Algorithmic problem solving 2


DEFINITION: A function t(n) is said to be in O(g(n)), denoted t(n) ∈ O(g(n)), if t(n) is bounded above by some constant multiple of g(n) for all large n, i.e., if there exist some positive constant c and some nonnegative integer n0 such that
t(n) ≤ c·g(n) for all n ≥ n0.

Big Oh: f(n) ≤ c·g(n) for all n ≥ n0.
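For example (added as a brief illustration), t(n) = 3n + 2 is in O(n): for all n ≥ 2 we have 3n + 2 ≤ 3n + n = 4n, so the definition is satisfied with c = 4 and n0 = 2.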
Orders of Growth

The basic efficiency functions, in increasing order of growth: log2 n, n, n log2 n, n^2, n^3, 2^n, n!
Orders of Growth table

Class              Name
O(1)               Constant
O(log log n)       bi-logarithmic or log log n
O(log n)           logarithmic or log n
O((log n)^k) or O(log^k n)   poly-logarithmic
O(n^0.5)           square root
O(n)               Linear
O(n log n)         linear logarithmic or n log n
O(n^2)             Quadratic
O(n^2 log n)       quadratic logarithmic
O(n^3)             Cubic
O(2^n)             base-2 exponential
O(e^n)             natural exponential
O(3^n)             base-3 exponential
O(n!)              Factorial
O(n^n)             hyper-exponential

Big-O Examples of Algorithms

Notation     Examples
O(1)         Accessing an array element, constant loops, Push, Pop, Enqueue (if there is a tail reference), Dequeue
O(log n)     Binary search
O(n)         Linear search, summing a 1D array
O(n log n)   Heap sort, Quick sort (average case), Merge sort
O(n^2)       Selection sort, Insertion sort, Bubble sort, summing a 2D array of size n*n
O(n^3)       Matrix multiplication
O(2^n)       Towers of Hanoi, recursive Fibonacci, finding the (exact) solution to the traveling salesman problem (TSP) using dynamic programming
O(n!)        Solving the traveling salesman problem via brute-force search
O(n^n)       Ackermann function

Ex1
loglinear complexity

O(nlogn), also known as loglinear complexity, implies that logn operations will occur n
times. It’s commonly used in recursive sorting algorithms and binary tree sorting
algorithms.

Coding
#include <stdio.h>
// let's take a[5] = {32, 45, 67, 2, 7} as the array to be sorted
void merge(int a[], int p, int q, int r);  // forward declaration for the merge step defined below

// merge sort function
void mergeSort(int a[], int p, int r)
{
int q;
if(p < r)
{
q = (p + r) / 2;
mergeSort(a, p, q);
mergeSort(a, q+1, r);
merge(a, p, q, r);
}
}

// function to merge the subarrays


void merge(int a[], int p, int q, int r)
{
int b[5]; //same size of a[]
int i, j, k;
k = 0;
i = p;
j = q + 1;
while(i <= q && j <= r)
{
if(a[i] < a[j])
{
b[k++] = a[i++]; // same as b[k]=a[i]; k++; i++;
}
else
{
b[k++] = a[j++];
}
}

while(i <= q)
{
b[k++] = a[i++];
}

while(j <= r)
{
b[k++] = a[j++];
}

for(i=r; i >= p; i--)


{
a[i] = b[--k]; // copying back the sorted list to a[]
}
}

// function to print the array


void printArray(int a[], int size)
{
int i;
for (i=0; i < size; i++)
{
printf("%d ", a[i]);
}
printf("\n");
}

int main()
{
int arr[] = {32, 45, 67, 2, 7};
int len = sizeof(arr)/sizeof(arr[0]);

printf("Given array: \n");


printArray(arr, len);

// calling merge sort


mergeSort(arr, 0, len - 1);

printf("\nSorted array: \n");


printArray(arr, len);
return 0;
}
Complexity Analysis of Merge Sort

Merge Sort is quite fast, with a time complexity of O(n log n). It is also a stable sort, which means that "equal" elements keep the same relative order in the sorted list as in the input.

Ex 2.1 Sort all the functions below in increasing order of asymptotic (Big O) growth. If some have the same asymptotic growth, then be sure to indicate that. Note: log means base 2.

5n, 4 log n, 4 log log n, n^4, n^(1/2) log^4 n, (log n)^(5 log n), n^(log n), 5^n, 4^(n^4), 4^(4^n), 5^(5n), 5^(5^n), n^(n^(1/5)), n^(n/4), (n/4)^(n/4)

Solution: 4 log log n < 4 log n < n^(1/2) log^4 n < 5n < n^4 < (log n)^(5 log n) < n^(log n) < n^(n^(1/5)) < 5^n < 5^(5n) < (n/4)^(n/4) < n^(n/4) < 4^(n^4) < 4^(4^n) < 5^(5^n)
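(A handy technique for such comparisons, noted here as an aside: take log base 2 of each function and compare the results. For example, log((log n)^(5 log n)) = 5 log n * log log n, while log(n^(log n)) = (log n)^2; since 5 log log n grows more slowly than log n, (log n)^(5 log n) grows more slowly than n^(log n).)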

Ex 2.2 Order the functions according to their growth, from slowest to fastest growing:

6n^3, n log_6 n, 4n, 8n^2, log_2 n, n log_2 n, log_8 n, 64, 8^(2n)

Solution: 64, log_8 n, log_2 n, 4n, n log_6 n, n log_2 n, 8n^2, 6n^3, 8^(2n)

Result:
Thus the functions were ordered according to their growth, from slowest to fastest, using Big O notation successfully.

Task 3: Divide and Conquer – Quicksort algorithm

(I)
AIM

To Sort a given set of elements using the Quicksort method and determine the time required to sort
the elements.

ALGORITHM
1. Choose the value at the highest index as the pivot
2. Take two variables to point to the left and right of the list, excluding the pivot
3. left points to the low index
4. right points to the high index
5. While the value at left is less than the pivot, move the left pointer to the right
6. While the value at right is greater than the pivot, move the right pointer to the left
7. If the conditions in steps 5 and 6 no longer hold, swap the values at left and right
8. When left ≥ right, the point where they meet is the new pivot position

PROGRAM

#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <sys/time.h>

void swap(int *x, int *y)
{
    int temp;
    temp = *x;
    *x = *y;
    *y = temp;
}

/* fill the array with n pseudo-random values */
void generate_random(int a[], int n)
{
    int i;
    srand(time(0));
    for (i = 0; i < n; i++)
        a[i] = rand() % 1000;
}

/* partition a[l..h]; the first element a[l] is used as the pivot */
int Partition(int a[], int l, int h)
{
    int i, j, p;
    i = l;
    j = h + 1;
    p = l;
    do
    {
        do
        {
            i = i + 1;
        } while (i <= h && a[i] < a[p]);   /* scan right for an element >= pivot */
        do
        {
            j = j - 1;
        } while (a[j] > a[p]);             /* scan left for an element <= pivot */
        if (i < j)
            swap(&a[i], &a[j]);
    } while (i < j);
    swap(&a[l], &a[j]);                    /* place the pivot in its final position */
    return j;
}

void Quicksort(int a[], int m, int n)
{
    int s;
    if (m < n)
    {
        s = Partition(a, m, n);
        Quicksort(a, m, s - 1);
        Quicksort(a, s + 1, n);
    }
}

int main()
{
    int a[100000], i, ch, n;
    struct timeval t;
    double start, end;
    FILE *fp;

    printf("Enter the type of operation\n");
    printf("1-Randomly generate numbers and quicksort\n");
    printf("2-Enter the number of values to generate and sort\n");
    scanf("%d", &ch);
    switch (ch)
    {
    case 1:
        fp = fopen("quicksort.txt", "w");
        for (i = 10000; i < 100000; i = i + 5000)
        {
            generate_random(a, i);
            gettimeofday(&t, NULL);
            start = t.tv_sec + (t.tv_usec / 1000000.0);
            Quicksort(a, 0, i - 1);
            gettimeofday(&t, NULL);
            end = t.tv_sec + (t.tv_usec / 1000000.0);
            printf("%d\t%lf\n", i, end - start);
            fprintf(fp, "%d\t%lf\n", i, end - start);
        }
        fclose(fp);
        break;
    case 2:
        printf("Enter the number of values to be generated\n");
        scanf("%d", &i);
        generate_random(a, i);
        for (n = 0; n < i; n++)            /* show the generated (unsorted) values */
            printf("%d ", a[n]);
        printf("\n");
        gettimeofday(&t, NULL);
        start = t.tv_sec + (t.tv_usec / 1000000.0);
        Quicksort(a, 0, i - 1);
        gettimeofday(&t, NULL);
        end = t.tv_sec + (t.tv_usec / 1000000.0);
        printf("%d\t%lf\n", i, end - start);
        printf("Sorted numbers are\n");
        printf("The sorted array is\n");
        for (n = 0; n < i; n++)
            printf("%d\t", a[n]);
        break;
    default:
        printf("Invalid choice\n");
    }
    return 0;
}

OUTPUT:

Enter the type of operation


1-Randomly generate numbers and quicksort
2-Enter the number of values to generate and sort
1
10000 0.002030
15000 0.001000
20000 0.001996
25000 0.001996
30000 0.002994
35000 0.002993
40000 0.003990
45000 0.003991
50000 0.004987
55000 0.005986
60000 0.006014
65000 0.006983
70000 0.007980
75000 0.007980
80000 0.007980
85000 0.008977
90000 0.008975
95000 0.008976

Test Case 2:
Enter the type of operation
1-Randomly generate numbers and quicksort
2-Enter the number of values to generate and sort
2
Enter the number of values to be generated
10
978 439 857 868 982 985 813 587 318 480
10 0.000000
Sorted numbers are
The sorted array is
318 439 480 587 813 857 868 978 982 985

RESULT: Thus the C program for Quicksort was implemented and executed successfully.

Task 4: Divide and Conquer – Merge Sort algorithm

AIM

Using OpenMP, implement a parallelized Merge Sort algorithm to sort a given set of elements and
determine the time required to sort the elements

ALGORITHM
MergeSort(arr[], l, r)

1. Start
2. The elements can be read from a file or can be generated using the random number
generator
3. Declare array and left, right, mid variable
4. If r > l
5. Find the middle point to divide the array into two halves:
6. middle m = l + (r – l)/2
7. Call mergeSort for first half:
8. Call mergeSort(arr, l, m)
9. Call mergeSort for second half:
10. Call mergeSort(arr, m + 1, r)
11. Merge the two halves sorted in steps 8 and 10:
12. Call merge(arr, l, m, r)
13. Perform merge function.

1. if left >= right
2. return
3. mid= (left+right)/2
4. mergesort(array, left, mid)
5. mergesort(array, mid+1, right)
6. merge(array, left, mid, right)

14. Write the output in the file


15. Stop.

PROGRAM

#include <stdio.h>
#include <stdlib.h>
#include <sys/time.h>
#include <omp.h>

/* merge the two sorted halves a[low..mid] and a[mid+1..high] */
void simplemerge(int a[], int low, int mid, int high)
{
    int i, j, k, c[10000];
    i = low;
    j = mid + 1;
    k = low;
    while (i <= mid && j <= high)
    {
        if (a[i] < a[j])
            c[k++] = a[i++];
        else
            c[k++] = a[j++];
    }
    while (i <= mid)
        c[k++] = a[i++];
    while (j <= high)
        c[k++] = a[j++];
    for (k = low; k <= high; k++)
        a[k] = c[k];
}

/* recursive merge sort; the two halves are sorted in parallel OpenMP sections
   (nested parallel regions run serially unless nested parallelism is enabled) */
void merge(int a[], int low, int high)
{
    int mid;
    if (low < high)
    {
        mid = (low + high) / 2;
        #pragma omp parallel sections
        {
            #pragma omp section
            merge(a, low, mid);
            #pragma omp section
            merge(a, mid + 1, high);
        }
        simplemerge(a, low, mid, high);
    }
}

/* fill the array with n pseudo-random values */
void getnumber(int a[], int n)
{
    int i;
    for (i = 0; i < n; i++)
        a[i] = rand() % 10000;
}

int main()
{
    FILE *fp;
    int a[10000], i;
    struct timeval tv;
    double start, end, elapse;

    omp_set_num_threads(5);
    fp = fopen("m.txt", "w");
    printf("Num.\ttime\n");
    for (i = 0; i < 9999; i += 100)
    {
        getnumber(a, i);
        gettimeofday(&tv, NULL);
        start = tv.tv_sec + (tv.tv_usec / 1000000.0);   /* microseconds to seconds */
        merge(a, 0, i - 1);
        gettimeofday(&tv, NULL);
        end = tv.tv_sec + (tv.tv_usec / 1000000.0);
        elapse = end - start;
        if (elapse >= 0)
            fprintf(fp, "%d\t%lf\n", i, elapse);
    }
    fclose(fp);
    system("gnuplot");
    return 0;
}
OUTPUT:
guest-4er71n@user-HP-280-G4-MT-Business-PC:~$ cc -fopenmp merge.c
guest-4er71n@user-HP-280-G4-MT-Business-PC:~$ ./a.out
Num. time
guest-4er71n@user-HP-280-G4-MT-Business-PC:~$ vi m.txt

Num.   Time
0      0.000010
500    0.051020
1000   0.092640
1500   0.091000
2000   0.137790
2500   0.112610
3000   0.049490
3500   0.046230
4000   0.050930
4500   0.080480
5000   0.061260
5500   0.069880
6000   0.078240
6500   0.082940
7000   0.120360
7500   0.092320
8000   0.100990
8500   0.108600
9000   0.113700
9500   0.119730

RESULT: Thus the C program for parallelized Merge Sort using OpenMP was implemented successfully.
