The document discusses algorithm analysis using asymptotic notations, particularly Theta (Θ) notation for tight bounds. It compares the substitution method and master method for solving recurrences, provides time complexity for merge sort, and outlines various design strategies in algorithm design. Additionally, it explains the Master Method cases with examples and details the selection sort algorithm with a specific array.


a. Use of asymptotic notations:

They describe the running time or space requirement of an algorithm as a function of the input size, which makes it possible to analyze and compare efficiency independently of hardware and implementation details.

Theta (Θ) notation:

It represents the tight bound of an algorithm, i.e., it gives both the upper and lower bounds:

T(n) = Θ(f(n)) means the running time grows at the same rate as f(n) asymptotically, within constant factors.

b. Substitution method vs Master method:

Substitution method: Guess the form of the solution and use mathematical induction to prove it; it works for many recurrences but requires a good initial guess.

Master method: Directly provides the solution for recurrences of the form T(n) = aT(n/b) + f(n) using predefined cases; faster, but applies only to recurrences of that form.

c. Time complexity of merge sort:

O(n log n) in all cases (worst, average, and best).

d. T(n) = 2T(n - 1) + 1, T(1) = 1

Using repeated substitution:

T(n) = 2T(n-1) + 1
     = 2[2T(n-2) + 1] + 1 = 4T(n-2) + 2 + 1
     = 8T(n-3) + 4 + 2 + 1
     = …
     = 2ⁿ⁻¹T(1) + (2ⁿ⁻² + 2ⁿ⁻³ + … + 1)
     = 2ⁿ⁻¹ + (2ⁿ⁻¹ - 1) = 2ⁿ - 1

So, T(n) = 2ⁿ - 1

e. Design strategies in algorithm design:

1. Divide and Conquer
2. Greedy Method
3. Dynamic Programming
4. Backtracking
5. Branch and Bound

2(a) T(n) = 2T(n / 2) + n log n, T(1) = 1

Use Master Method:

Compare with: T(n) = aT(n/b) + f(n)

Here, a = 2, b = 2, and f(n) = n log n

Now, compute n^log_b a = n^log₂2 = n

Compare f(n) = n log n with n^log_b a = n

We see:

f(n) = n log n = n^log_b a × log¹ n. It is larger than n only by a logarithmic factor, so it is NOT Ω(n^(1+ε)) for any ε > 0, and Case 3 does not apply.

Instead this matches the extended Case 2 of the Master Theorem (f(n) = Θ(n^log_b a × log^k n)) with k = 1, which gives T(n) = Θ(n^log_b a × log^(k+1) n).

So, T(n) = Θ(n log² n)

2(b)

T(n) = T(n / 3) + T(2n / 3) + n, T(1) = 1

Use Recursion Tree Method:

At each level, work done = n

Subproblem sizes = n/3 and 2n/3


Depth of tree ≈ log₃ n (since the largest branch reduces by a factor of 3 each
time)

Each level does Θ(n) work, and the number of levels = Θ(log n)

So, T(n) = Θ(n log n)

3(a) Cases of Master Method with Example

Master Method is used to solve recurrences of the form:

T(n) = aT(n/b) + f(n), where

a ≥ 1, b > 1, and f(n) is an asymptotically positive function.

Case 1: f(n) = O(n^(log_b a - ε)) for some ε > 0

T(n) = Θ(n^log_b a)

Example: T(n) = 4T(n/2) + n

Here, a = 4, b = 2, log_b a = log₂4 = 2

f(n) = n = O(n^(2-ε)) with ε = 1

So, T(n) = Θ(n²)

Case 2: f(n) = Θ(n^log_b a × log^k n) for some k ≥ 0

T(n) = Θ(n^log_b a × log^(k+1) n)

Example: T(n) = 2T(n/2) + n log n

a = 2, b = 2, log_b a = 1

f(n) = n log n = Θ(n^1 × log^1 n)

So, T(n) = Θ(n log² n)

Case 3: f(n) = Ω(n^log_b a + ε) for some ε > 0, and regularity condition holds:

If a f(n/b) ≤ c f(n) for some c < 1, then:

T(n) = Θ(f(n))

Example: T(n) = 2T(n/2) + n²

a = 2, b = 2, log_b a = 1

f(n) = n² = Ω(n^(1+ε)) with ε = 1
Regularity holds: a f(n/b) = 2(n/2)² = n²/2 ≤ c × n² for c = 1/2 < 1

So, T(n) = Θ(n²)

3(b) Selection Sort on Array (10, 2, 4, 56, 4, 58)

Array: (10, 2, 4, 56, 4, 58)

Selection Sort Algorithm:

void selectionSort(int A[], int n) {
    for (int i = 0; i < n - 1; i++) {
        int min_idx = i;                 // index of smallest element in A[i..n-1]
        for (int j = i + 1; j < n; j++) {
            if (A[j] < A[min_idx])
                min_idx = j;
        }
        // Swap the minimum into position i (once per pass, after the scan)
        int temp = A[i];
        A[i] = A[min_idx];
        A[min_idx] = temp;
    }
}

Pass-by-pass sorting:

Pass 1: Find min (2), swap with 1st → (2, 10, 4, 56, 4, 58)

Pass 2: Find min (4), swap with 2nd → (2, 4, 10, 56, 4, 58)

Pass 3: Find min (4), swap with 3rd → (2, 4, 4, 56, 10, 58)

Pass 4: Find min (10), swap with 4th → (2, 4, 4, 10, 56, 58)

Pass 5: Min (56) already in place → (2, 4, 4, 10, 56, 58)

Sorted after 5 passes (n − 1 passes for n = 6).
