DAA Lecture 2
Asymptotic Notations
Big-O Notation
Omega Notation
Theta Notation
Order of growth
Asymptotic notations
Asymptotic analysis is a useful tool that helps structure our thinking toward better algorithm design.
• We shouldn’t ignore asymptotically slower algorithms, however.
• Real-world design situations often call for a careful balancing of criteria.
o Asymptotic complexity is a way of expressing the cost of an algorithm using idealized units of computational work. Consider, for example, the algorithm for sorting a deck of cards that proceeds by repeatedly searching through the deck for the lowest card. The asymptotic complexity of this algorithm is the square of the number of cards in the deck.
Note that we think about bounds on the performance of algorithms, rather than giving exact speeds:
o The actual number of steps required to sort our deck of cards (with our naive quadratic algorithm) depends on the order in which the cards begin.
o The actual time to perform each step depends on processor speed, the condition of the processor cache, and so on.
BIG-O NOTATION
• Big-O is the formal method of expressing an upper bound on an algorithm's running time. It is a measure of the longest amount of time the algorithm could possibly take to complete.
• More formally, for non-negative functions f(n) and g(n):
• if there exist an integer n0 and a constant c > 0 such that for all integers n > n0, f(n) ≤ c·g(n),
• then f(n) is Big-O of g(n).
• This is denoted as "f(n) = O(g(n))".
• When graphed, c·g(n) serves as an upper bound to the curve being analyzed, f(n).
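This definition can be spot-checked numerically. The helper below is an illustrative sketch (the function name is my own, not from the slides): it tests f(n) ≤ c·g(n) over a finite range of n. A finite scan can refute a Big-O claim but never prove it, since the definition quantifies over all n > n0.

```python
def satisfies_big_o(f, g, c, n0, n_max=10_000):
    """Spot-check the Big-O condition f(n) <= c*g(n) for n0 < n <= n_max.

    Passing is evidence, not proof: Big-O is a claim about ALL n > n0,
    and a finite scan can only refute it.
    """
    return all(f(n) <= c * g(n) for n in range(n0 + 1, n_max + 1))

# Illustrative claim: 3n + 5 = O(n), with witnesses c = 4 and n0 = 5,
# since 3n + 5 <= 4n exactly when n >= 5.
print(satisfies_big_o(lambda n: 3 * n + 5, lambda n: n, c=4, n0=5))  # True
```

Note the choice of witnesses matters: with c = 3 the same check would fail for every n, because 3n + 5 > 3n always.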
EXAMPLE
Example: n^2 + n = O(n^3)
Proof:
• Here, we have f(n) = n^2 + n, and g(n) = n^3.
• Notice that if n ≥ 1, then n ≤ n^3 and n^2 ≤ n^3.
• Side note: in general, if a ≤ b, then n^a ≤ n^b whenever n ≥ 1. This fact is used often in these types of proofs.
• Therefore,
n^2 + n ≤ n^3 + n^3 = 2n^3
• We have just shown that
n^2 + n ≤ 2n^3 for all n ≥ 1
• Thus, we have shown that n^2 + n = O(n^3) (by the definition of Big-O, with n0 = 1 and c = 2).
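The inequality just derived can also be spot-checked by machine, as a sanity check under the proof's assumption n ≥ 1 (not a replacement for the proof):

```python
# Verify n^2 + n <= 2*n^3 for every integer 1 <= n <= 1000.
assert all(n**2 + n <= 2 * n**3 for n in range(1, 1001))
print("n^2 + n <= 2*n^3 holds for all checked n")
```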
BIG-OMEGA NOTATION
For non-negative functions f(n) and g(n), if there exist an integer n0 and a constant c > 0 such that for all integers n > n0, f(n) ≥ c·g(n), then f(n) is Omega of g(n).
This is denoted as "f(n) = Ω(g(n))".
This is almost the same as the definition of Big-O, except that here "f(n) ≥ c·g(n)"; this makes g(n) a lower bound function instead of an upper bound function.
It describes the best that can happen for a given input size.
EXAMPLE
Example: n^3 + 4n^2 = Ω(n^2)
Proof:
• Here, we have f(n) = n^3 + 4n^2, and g(n) = n^2.
• It is not too hard to see that if n ≥ 0,
n^3 ≤ n^3 + 4n^2
• We have already seen that if n ≥ 1,
n^2 ≤ n^3
• Thus when n ≥ 1,
n^2 ≤ n^3 ≤ n^3 + 4n^2
• Therefore,
1·n^2 ≤ n^3 + 4n^2 for all n ≥ 1
• Thus, we have shown that n^3 + 4n^2 = Ω(n^2) (by the definition of Big-Ω, with n0 = 1 and c = 1).
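The same kind of finite spot check works for the lower bound, under the proof's assumption n ≥ 1:

```python
# Verify the Omega witness: 1*n^2 <= n^3 + 4*n^2 for every integer 1 <= n <= 1000.
assert all(1 * n**2 <= n**3 + 4 * n**2 for n in range(1, 1001))
print("lower bound holds for all checked n")
```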
THETA NOTATION
• The function f(n) = Θ(g(n)) (read as "f of n is theta of g of n")
• iff there exist positive constants c1, c2, and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
• The Theta notation is more precise than both the Big-O and Big-Omega notations:
• f(n) = Θ(g(n)) iff g(n) is both a lower and an upper bound of f(n).
EXAMPLE
Example: n^2 + 5n + 7 = Θ(n^2)
Proof:
• When n ≥ 1,
n^2 + 5n + 7 ≤ n^2 + 5n^2 + 7n^2 = 13n^2
• When n ≥ 0,
n^2 ≤ n^2 + 5n + 7
• Thus, when n ≥ 1,
1·n^2 ≤ n^2 + 5n + 7 ≤ 13n^2
Thus, we have shown that n^2 + 5n + 7 = Θ(n^2) (by the definition of Big-Θ, with n0 = 1, c1 = 1, and c2 = 13).
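Both Theta bounds from the proof can be spot-checked together over a finite range of n ≥ 1 (a sanity check, not a proof):

```python
# Verify 1*n^2 <= n^2 + 5n + 7 <= 13*n^2 for every integer 1 <= n <= 1000.
for n in range(1, 1001):
    f = n**2 + 5 * n + 7
    assert 1 * n**2 <= f <= 13 * n**2
print("Theta bounds hold for all checked n")
```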
Commonly used Asymptotic Notations
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)
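The ordering above can be made concrete by evaluating each function at moderate values of n. This is a sketch: the ordering is asymptotic, and for small n it can differ (e.g. 2^n = 256 < n^3 = 512 at n = 8), so the values below are shown at n where the ranking has already settled.

```python
import math

# Common growth functions, listed from slowest- to fastest-growing.
growth = [
    ("1",       lambda n: 1.0),
    ("log n",   lambda n: math.log2(n)),
    ("n",       lambda n: float(n)),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: float(n**2)),
    ("n^3",     lambda n: float(n**3)),
    ("2^n",     lambda n: float(2**n)),
    ("n!",      lambda n: float(math.factorial(n))),
]

for n in (16, 32):
    values = [f(n) for _, f in growth]
    # At these n the values already appear in strictly increasing order.
    print(f"n={n}: " + ", ".join(f"{name}={v:.3g}"
                                 for (name, _), v in zip(growth, values)))
```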
RATE OF GROWTH
x = x + 1;                          // a single statement: executes once

for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)
        x = x + 1;                  // inner statement executes n * n = n^2 times
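The n^2 annotation on the nested loop can be confirmed by counting executions of the inner statement. Below is a Python transliteration of the fragment; the counter variable is added purely for illustration.

```python
def inner_statement_count(n):
    """Run the doubly nested loop and count how often x = x + 1 executes."""
    x = 0
    count = 0
    for i in range(1, n + 1):          # i = 1 .. n
        for j in range(1, n + 1):      # j = 1 .. n
            x = x + 1                  # the basic operation
            count += 1
    return count

for n in (1, 5, 10):
    print(n, inner_statement_count(n))  # count equals n * n
```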
COMMON COMPLEXITY FUNCTIONS
Comparison of functions
Many of the relational properties of real numbers apply to asymptotic comparisons as well. For the following, assume that f(n) and g(n) are asymptotically positive.
Transitivity:
• f(n)=Θ(g(n)) and g(n)=Θ(h(n)) imply f(n)=Θ(h(n)),
• f(n)=O(g(n)) and g(n)=O(h(n)) imply f(n)=O(h(n)),
• f(n)=Ω(g(n)) and g(n)=Ω(h(n)) imply f(n)=Ω(h(n)),
• f(n)=o(g(n)) and g(n)=o(h(n)) imply f(n)=o(h(n)),
• f(n)=ω(g(n)) and g(n)=ω(h(n)) imply f(n)=ω(h(n)).
Reflexivity:
• f(n)=Θ(f(n)),
• f(n)=O(f(n)),
• f(n)=Ω(f(n)).
Symmetry:
• f(n)=Θ(g(n)) if and only if g(n)=Θ(f(n)).
Transpose symmetry:
• f(n)=O(g(n)) if and only if g(n)=Ω(f(n)),
• f(n)=o(g(n)) if and only if g(n)=ω(f(n)).
Because these properties hold for asymptotic notations, one can draw an analogy between the asymptotic comparison of two functions f and g and the comparison of two real numbers a and b:
• f(n)=O(g(n)) is similar to a ≤ b,
• f(n)=Ω(g(n)) is similar to a ≥ b,
• f(n)=Θ(g(n)) is similar to a = b,
• f(n)=o(g(n)) is similar to a < b,
• f(n)=ω(g(n)) is similar to a > b.
TASKS END OF LECTURE LEARNING (TELL):
TASK 1:
Missing socks: Imagine that after washing 5 distinct pairs of socks, you discover that two socks are missing.
Of course, you would like to have the largest number of complete pairs remaining. Thus, you are left with 4
complete pairs in the best-case scenario and with 3 complete pairs in
the worst case. Assuming that the probability of disappearance for each of the 10 socks is the same, find the
probability of the best-case scenario; the probability of the worst-case scenario; the number of pairs you
should expect in the average case.
TASK 2:
For each of the following pairs of functions, indicate whether the first function of each of the following pairs
has a lower, same, or higher order of growth (to within a constant multiple) than the second function.
a. n(n + 1) and 2000n^2
b. 100n^2 and 0.01n^3
c. log2 n and ln n
d. 2^(n−1) and 2^n
e. (n − 1)! and n!
TASK 3:
List the following functions according to their order of growth from the lowest to the highest:
(n − 2)!, 5 lg(n + 100)^10, 2^(2n), 0.001n^4 + 3n^3 + 1, ln^2 n, ∛n, 3^n.
TASK 4:
Prove the following assertions by using the definitions of the notations involved, or disprove them by giving a
specific counterexample.
a. If t(n) ∈ O(g(n)), then g(n) ∈ Ω(t(n)).
b. Θ(αg(n)) = Θ(g(n)), where α > 0.
c. Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).
TASK 5:
Use the informal definitions of O, Ω, and Θ to determine whether the following assertions are true or false.
a. n(n + 1)/2 ∈ O(n^3)
b. n(n + 1)/2 ∈ O(n^2)
c. n(n + 1)/2 ∈ Θ(n^3)
d. n(n + 1)/2 ∈ Ω(n)
References
• Fundamentals of Computer Algorithms, 2nd Edition (2008) by Ellis Horowitz, Sartaj Sahni, and Sanguthevar Rajasekaran
• Introduction to Algorithms, 3rd Edition (2009) by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein
THANK YOU
For queries
Email: vikash.e13657@cumail.in
07/30/2024