
Apex Institute of Technology

Department of Computer Science & Engineering


Bachelor of Engineering (Computer Science & Engineering)
Design and Analysis of Algorithms (21CSH-282)
Prepared By: Vikas Kumar (E13657)

07/30/2024 DISCOVER . LEARN . EMPOWER


Contents

• Asymptotic Notations
• Big-O Notation
• Omega Notation
• Theta Notation
• Order of Growth

Asymptotic notations

Asymptotic analysis is a useful tool that helps structure our thinking toward better algorithms.
• We shouldn't ignore asymptotically slower algorithms, however: real-world design situations often call for a careful balancing of concerns.

o Asymptotic complexity is a way of expressing the cost of an algorithm using idealized units of
computational work. Consider, for example, the algorithm for sorting a deck of cards that proceeds by
repeatedly searching through the deck for the lowest card. The asymptotic complexity of this algorithm is
the square of the number of cards in the deck.
Note that we think about bounds on the performance of algorithms rather than exact speeds.
o The actual number of steps required to sort our deck of cards (with our naive quadratic algorithm)
depends on the order in which the cards begin.
o The actual time to perform each step depends on processor speed, the state of the processor cache,
and so on.

BIG-O NOTATION
• Big-O is the formal method of expressing the upper bound of an algorithm's running time. It's a measure
of the longest amount of time it could possibly take for the algorithm to complete.
• More formally, for non-negative functions f(n) and g(n),
• if there exist an integer n0 and a constant c > 0 such that for all integers n > n0, f(n) ≤ c·g(n),
• then f(n) is Big-O of g(n).
• This is denoted as "f(n) = O(g(n))".
• If graphed, g(n) serves as an upper bound to the curve you are analyzing, f(n).

EXAMPLE
Example: n^2 + n = O(n^3)
Proof:
• Here, we have f(n) = n^2 + n and g(n) = n^3.
• Notice that if n ≥ 1, then n ≤ n^3.
• Also notice that if n ≥ 1, then n^2 ≤ n^3.
• Side note: in general, if a ≤ b, then n^a ≤ n^b whenever n ≥ 1. This fact is used often in these types of
proofs.
• Therefore,
  n^2 + n ≤ n^3 + n^3 = 2n^3
• We have just shown that
  n^2 + n ≤ 2n^3 for all n ≥ 1.
• Thus, we have shown that n^2 + n = O(n^3)
  (by the definition of Big-O, with n0 = 1 and c = 2).

BIG-OMEGA NOTATION
• For non-negative functions f(n) and g(n), if there exist an integer n0 and a constant c > 0 such that
for all integers n > n0, f(n) ≥ c·g(n), then f(n) is omega of g(n).
• This is denoted as "f(n) = Ω(g(n))".
• This is almost the same definition as Big-O, except that here "f(n) ≥ c·g(n)"; this makes g(n) a
lower-bound function instead of an upper-bound function.
• It describes the best that can happen for a given data size.

• Omega is the reverse of Big-O.
• Omega gives us a LOWER BOUND on a function.
• Big-O says, "Your algorithm is at least this good."
• Omega says, "Your algorithm is at least this bad."
EXAMPLE
Example: n^3 + 4n^2 = Ω(n^2)
Proof:
• Here, we have f(n) = n^3 + 4n^2 and g(n) = n^2.
• It is not too hard to see that if n ≥ 0,
  n^3 ≤ n^3 + 4n^2.
• We have already seen that if n ≥ 1,
  n^2 ≤ n^3.
• Thus, when n ≥ 1,
  n^2 ≤ n^3 ≤ n^3 + 4n^2.
• Therefore,
  1·n^2 ≤ n^3 + 4n^2 for all n ≥ 1.
• Thus, we have shown that n^3 + 4n^2 = Ω(n^2) (by the definition of Big-Ω, with n0 = 1 and c = 1).

THETA NOTATION
• The function f(n) = Θ(g(n)) (read as "f of n is theta of g of n")
• iff there exist positive constants c1, c2, and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
• The theta notation is more precise than both the Big-O and Big-Ω notations.
• The function f(n) = Θ(g(n)) iff g(n) is both a lower and an upper bound of f(n).

EXAMPLE

Example: n^2 + 5n + 7 = Θ(n^2)
Proof:
• When n ≥ 1,
  n^2 + 5n + 7 ≤ n^2 + 5n^2 + 7n^2 = 13n^2.
• When n ≥ 0,
  n^2 ≤ n^2 + 5n + 7.
• Thus, when n ≥ 1,
  1·n^2 ≤ n^2 + 5n + 7 ≤ 13n^2.
Thus, we have shown that n^2 + 5n + 7 = Θ(n^2) (by the definition of Big-Θ, with n0 = 1, c1 = 1,
and c2 = 13).

Commonly used Asymptotic Notations

Type           Big-O notation
Constant       O(1)
Logarithmic    O(log n)
Linear         O(n)
n log n        O(n log n)
Quadratic      O(n^2)
Cubic          O(n^3)
Polynomial     n^O(1)
Exponential    O(2^n)

Run-time performance of the above complexities, in ascending order:

O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)

RATE OF GROWTH

• Let the performances of two professionals, X and Y, be TX(n) and TY(n), respectively.
• Conclusion: for large n, TX(n) ≻ TY(n).
• So who is the better professional? Ans: X.
• From what point on does X outperform Y, and by what factor? Ans: beyond the threshold n0, by the
constant factor c.
STEP COUNT
Instructions / code                          Step count

x = x + 1                                    constant (c)

int main() {
    printf("Chandigarh UNIVERSITY");
    return 0;
}                                            constant (c)

for (i = 1; i <= n; i++)                     n
    x = x + 1;

for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)                 n^2
        x = x + 1;
Example

int i, j, n = 5, a[5] = {12, 2, 51, 35, 7};

for (i = 0; i < n - 1; i++) {
    for (j = 0; j < n - 1 - i; j++)
        if (a[j] > a[j+1]) {
            a[j]   = a[j] + a[j+1];
            a[j+1] = a[j] - a[j+1];
            a[j]   = a[j] - a[j+1];
        }
}
for (i = 0; i < n; i++)
    printf("%d", a[i]);

T(n) = ?
COMMON COMPLEXITY FUNCTIONS
Comparison of functions
Many of the relational properties of real numbers apply to asymptotic comparisons as well. For the
following, assume that f(n) and g(n) are asymptotically positive.

• Transitivity:
  • f(n) = Θ(g(n)) and g(n) = Θ(h(n)) imply f(n) = Θ(h(n)),
  • f(n) = O(g(n)) and g(n) = O(h(n)) imply f(n) = O(h(n)),
  • f(n) = Ω(g(n)) and g(n) = Ω(h(n)) imply f(n) = Ω(h(n)),
  • f(n) = o(g(n)) and g(n) = o(h(n)) imply f(n) = o(h(n)),
  • f(n) = ω(g(n)) and g(n) = ω(h(n)) imply f(n) = ω(h(n)).

• Reflexivity:
  • f(n) = Θ(f(n)),
  • f(n) = O(f(n)),
  • f(n) = Ω(f(n)).

• Symmetry:
  • f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).
COMMON COMPLEXITY FUNCTIONS
Transpose symmetry:
• f(n) = O(g(n)) if and only if g(n) = Ω(f(n)),
• f(n) = o(g(n)) if and only if g(n) = ω(f(n)).

Because these properties hold for asymptotic notations, one can draw an analogy between the
asymptotic comparison of two functions f and g and the comparison of two real numbers a and b:
• f(n) = O(g(n)) is similar to a ≤ b,
• f(n) = Ω(g(n)) is similar to a ≥ b,
• f(n) = Θ(g(n)) is similar to a = b,
• f(n) = o(g(n)) is similar to a < b,
• f(n) = ω(g(n)) is similar to a > b.

TASKS END OF LECTURE LEARNING (TELL):

TASK 1:
Missing socks. Imagine that after washing 5 distinct pairs of socks, you discover that two socks are missing.
Of course, you would like to have the largest number of complete pairs remaining. Thus, you are left with 4
complete pairs in the best-case scenario and with 3 complete pairs in the worst case. Assuming that the
probability of disappearance is the same for each of the 10 socks, find the probability of the best-case
scenario; the probability of the worst-case scenario; and the number of pairs you should expect in the
average case.
TASK 2:
For each of the following pairs of functions, indicate whether the first function has a lower, the same, or a
higher order of growth (to within a constant multiple) than the second function.
a. n(n + 1) and 2000n^2        b. 100n^2 and 0.01n^3
c. log2 n and ln n             d. 2^(n−1) and 2^n
e. (n − 1)! and n!

TASKS END OF LECTURE LEARNING (TELL):
TASK 3:
List the following functions according to their order of growth, from the lowest to the highest:
(n − 2)!, 5 lg(n + 100)^10, 2^(2n), 0.001n^4 + 3n^3 + 1, ln^2 n, ∛n, 3^n.
TASK 4:
Prove the following assertions by using the definitions of the notations involved, or disprove them by giving a
specific counterexample.
a. If t(n) ∈ O(g(n)), then g(n) ∈ Ω(t(n)).
b. Θ(αg(n)) = Θ(g(n)), where α > 0.
c. Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).
TASK 5:
Use the informal definitions of O, Ω, and Θ to determine whether the following assertions are true or
false.
a. n(n + 1)/2 ∈ O(n^3)    b. n(n + 1)/2 ∈ O(n^2)
c. n(n + 1)/2 ∈ Θ(n^3)    d. n(n + 1)/2 ∈ Ω(n)
References
• Fundamentals of Computer Algorithms, 2nd Edition (2008) by Ellis Horowitz, Sartaj Sahni, and
Sanguthevar Rajasekaran
• Introduction to Algorithms, 3rd Edition (2009) by Thomas H. Cormen, Charles E. Leiserson,
Ronald L. Rivest, and Clifford Stein

THANK YOU

For queries
Email: vikash.e13657@cumail.in

