Asymptotic Notations
Asymptotic notations are mathematical tools used to represent the time complexity of algorithms.
1 < log n < √n < n < n log n < n^2 < n^3 < … < 2^n < 3^n < … < n^n
(in this ordering, functions to the left give lower bounds and functions to the right give upper bounds)
So, we can say f(n) >= c*g(n) for c = 1 and n0 = 1, i.e., f(n) = Ω(g(n)) = Ω(n^2)
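As a quick numerical illustration of the growth ordering above (this sketch is not part of the original slides), the following Python snippet tabulates a few of these functions for increasing n; 2^n is reported only by its number of decimal digits because its value quickly becomes too large to print.

# Illustration of the growth ordering: each column grows faster than the one before it.
import math

for n in (10, 100, 1000):
    print(f"n={n:<5} log n={math.log2(n):<8.1f} sqrt n={math.sqrt(n):<8.1f} "
          f"n log n={n * math.log2(n):<10.1f} n^2={n**2:<8} "
          f"digits(2^n)={len(str(2**n))}")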
Asymptotic Notations: θ notation
Formal Definition:
f(n) = θ(g(n)) if there exist positive constants c1, c2, and n0 such that
c1*g(n) <= f(n) <= c2*g(n) for all values of n >= n0
The same function g(n) serves as both the lower and the upper bound, so we get a tight bound.
So, f(n) = O(n^2 log n), f(n) = Ω(n^2 log n), and f(n) = θ(n^2 log n)
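The f(n) used in this example is not reproduced above, so the following minimal sketch assumes a hypothetical f(n) = 3n^2 log n + 5n and checks the θ definition against g(n) = n^2 log n with the constants c1 = 3, c2 = 8, n0 = 2.

import math

def f(n): return 3 * n**2 * math.log2(n) + 5 * n   # assumed example function
def g(n): return n**2 * math.log2(n)               # candidate tight bound

c1, c2, n0 = 3, 8, 2
# Verify c1*g(n) <= f(n) <= c2*g(n) over a test range (a check, not a proof).
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("f(n) = theta(n^2 log n) holds over the tested range")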
Asymptotic Notations
For f(n) = n!: since 1 <= n! <= n^n for all n >= 1,
f(n) = O(n^n)
f(n) = Ω(1)
Asymptotic Notations: Example
f(n) = log n!
We can write this as:
log(1*1*1*…*1) <= log(1*2*3*…*n) <= log(n*n*…*n)
log 1 <= log n! <= log n^n
0 <= log n! <= n log n
So, log n! = O(n log n) and log n! = Ω(1)
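A minimal numerical check of this bound, using math.lgamma(n + 1) to obtain ln(n!) without computing n! itself (natural logarithms; the bound holds in any base):

import math

for n in (2, 10, 100, 1000):
    log_fact = math.lgamma(n + 1)      # ln(n!)
    upper = n * math.log(n)            # ln(n^n) = n*ln(n)
    assert 0 <= log_fact <= upper
    print(f"n={n:<5} ln(n!)={log_fact:12.2f}   n*ln(n)={upper:12.2f}")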
Here again there is no common term for the lower and upper bounds, so no tight bound is obtained; in such cases the upper bound is used as the asymptotic bound.
Please note, a tight bound does not exist for the factorial function.
Theta notation always gives the tightest (most precise) complexity, while the other notations may not. Big O and Big Omega are used when we cannot determine an exact (tight) complexity.
Properties of Asymptotic Notations
1. If f(n) is O(g(n)) then a*f(n) is also O(g(n)) for any constant a > 0.
e.g., f(n) = 2n^2 + 3 = O(n^2), then
5*f(n) = 5*(2n^2 + 3) = 10n^2 + 15 => O(n^2)
The same property also holds for Ω and θ.
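A minimal sketch of this constant-multiple property for the example above; the constants c = 25, n0 = 1 are one possible choice (10n^2 + 15 <= 25n^2 for all n >= 1):

def five_f(n): return 5 * (2 * n**2 + 3)   # 5*f(n) = 10n^2 + 15

c, n0 = 25, 1
assert all(five_f(n) <= c * n**2 for n in range(n0, 10_000))
print("5*f(n) = O(n^2) with c = 25, n0 = 1 (tested range)")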
5. Transpose Symmetry
If f(n) is O(g(n)) then g(n) is Ω(f(n)).
e.g., f(n) = n + 3 = O(n), g(n) = 2n^2 + 1 = O(n^2)
Then, n = O(n^2) and n^2 = Ω(n)
This property holds between Big O and Big Omega.
6. If f(n) = O(g(n)) and f(n) = Ω(g(n)), i.e.,
c1*g(n) <= f(n) <= c2*g(n), then
f(n) = θ(g(n))
Find the asymptotic bound for the following
i. If f(n) = O(g(n)) and d(n) = O(e(n)), then f(n) + d(n) = ?
Assume f(n) = n and d(n) = n^2, then
f(n) + d(n) = n + n^2 = O(n^2)
If we take f(n) = n^2 and d(n) = n, then
f(n) + d(n) = n^2 + n = O(n^2) = O(max(g(n), e(n)))
This means the asymptotic bound of the sum is the bound of whichever function grows faster.
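A minimal check of this sum rule with the functions used above, f(n) = n and d(n) = n^2: their sum is bounded by 2*n^2 for n >= 1, i.e. O(max(n, n^2)) = O(n^2).

c, n0 = 2, 1
assert all((n + n**2) <= c * n**2 for n in range(n0, 10_000))
print("f(n) + d(n) = O(n^2) with c = 2, n0 = 1 (tested range)")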
It can be observed from the analysis that f2(n) will always be greater than f1(n) for n >= 10000.
Best, Worst, and Average case Analysis
Let's take an example of linear search
A = [7, 10, 12, 9, 21, 15, 6, 8]
If we want to search for the key = 21, 5 comparisons are needed.
For a key such as 75, which is not present in the list, the search is unsuccessful.
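A minimal sketch of linear search that counts key comparisons, using the array and keys from this example:

def linear_search(a, key):
    comparisons = 0
    for i, x in enumerate(a):
        comparisons += 1
        if x == key:
            return i, comparisons      # successful search
    return -1, comparisons             # unsuccessful: all n elements compared

A = [7, 10, 12, 9, 21, 15, 6, 8]
print(linear_search(A, 21))   # (4, 5)  -> found after 5 comparisons
print(linear_search(A, 75))   # (-1, 8) -> not found, n = 8 comparisons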
Since we are interested only in successful searches, consider all possible cases of a successful search.
Case 1: if the key is the first element (7) of the list [Best case]
Best case time = 1 (constant) => B(n) = 1
Case 2: if the key is the last element (8) of the list [Worst case]
Worst case time = n => W(n) = n
Case 3: Average case time = (sum of the times over all possible cases) / (number of cases)
= (1 + 2 + 3 + … + n) / n = [n(n+1)/2] / n = (n+1)/2
A(n) = (n+1)/2
In general, it is difficult to find the average case time of an algorithm, and it is usually taken to be of the same order as the worst case time.
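A minimal check that averaging the comparison counts over all n possible successful positions gives (n+1)/2:

n = 8
avg = sum(range(1, n + 1)) / n     # (1 + 2 + ... + n) / n
assert avg == (n + 1) / 2
print(f"A(n) = {avg} = (n+1)/2 for n = {n}")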
Analyzing Asymptotic notations
Best case time
B(n) = 1
B(n) = O(1)
B(n) = Ω(1)
So, B(n) = θ(1)
Worst case time
W(n) = n
W(n) = O(n)
W(n) = Ω(n)
So, W(n) = θ(n)
There is no fixed notation for representing the best case or the worst case time; each can be expressed using any of the asymptotic notations. It is not true that the best case is always represented by θ and the worst case by O.