Asymptotic Notations by Codewithharry
Asymptotic notations are mathematical tools used to represent the complexity of an algorithm for asymptotic analysis. Three notations are commonly used.
Asymptotic analysis of an algorithm refers to defining mathematical bounds on its run-time performance. Using asymptotic analysis, we can determine the best-case, average-case, and worst-case running time of an algorithm.
Asymptotic analysis is input-bound, i.e., if there is no input to the algorithm, it is assumed to run in constant time. All factors other than the input are treated as constant.
Asymptotic analysis refers to computing the running time of an operation in mathematical units of computation. For example, the running time of one operation may be computed as f(n) = n and that of another operation as g(n) = n². This means the running time of the first operation increases linearly as n grows, while the running time of the second operation increases quadratically. If n is significantly small, however, the running times of the two operations are nearly the same.
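A small sketch makes this concrete. The routines below are hypothetical (the names linear_op and quadratic_op are made up for illustration); they simply count basic operations so the two growth rates can be compared:

# Illustrative sketch: counting basic operations of a linear routine
# and a quadratic routine (hypothetical functions, for demonstration only).
def linear_op(n):
    # f(n): roughly n basic operations
    steps = 0
    for _ in range(n):
        steps += 1
    return steps

def quadratic_op(n):
    # g(n): roughly n * n basic operations
    steps = 0
    for _ in range(n):
        for _ in range(n):
            steps += 1
    return steps

for n in (2, 10, 100):
    print(n, linear_op(n), quadratic_op(n))
# For small n the two counts are close; as n grows, the quadratic
# count pulls far ahead of the linear one.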
Usually, the time required by an algorithm falls under three types (see the linear-search sketch after this list) −
• Best Case − Minimum time required for program execution.
• Average Case − Average time required for program execution.
• Worst Case − Maximum time required for program execution.
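A minimal linear-search sketch (an illustration of the idea, not code taken from the course) shows how the same algorithm can have different best-case and worst-case costs depending on the input:

# Linear search that also reports how many comparisons it made.
def linear_search(arr, target):
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

data = [7, 3, 9, 1, 5]
print(linear_search(data, 7))   # best case: target is first, 1 comparison
print(linear_search(data, 5))   # worst case (present): target is last, 5 comparisons
print(linear_search(data, 42))  # worst case (absent): all n elements compared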
Little o Notation
Besides the Big-Oh, Big-Omega, and Big-Theta notations, there are some other notations; the little o notation is one of them.
Little o notation is used to describe an upper bound that cannot be tight; in other words, it gives a loose upper bound on f(n). Formally, f(n) = o(g(n)) if, for every positive constant c, there exists an n0 such that 0 ≤ f(n) < c*g(n) for all n ≥ n0.
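As a small illustrative example (a standard fact, not specific to this course): 2n = o(n²), because for every constant c > 0 we have 2n < c*n² once n > 2/c; but 2n is not o(n), because with c = 1 the inequality 2n < n never holds.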
Little ω Notation
Another asymptotic notation is little omega, denoted by ω.
Little omega (ω) notation is used to describe a loose lower bound of f(n). Formally, f(n) = ω(g(n)) if, for every positive constant c, there exists an n0 such that 0 ≤ c*g(n) < f(n) for all n ≥ n0.
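Again as a small illustrative example: n² = ω(n), because for every constant c > 0 we have c*n < n² once n > c; but n² is not ω(n²), since c*n² < n² fails whenever c ≥ 1.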
Big Oh Notation
Big-Oh (O) notation gives an upper bound for a function f(n) to within a constant factor.
We write f(n) = O(g(n)) if there are positive constants n0 and c such that, to the right of n0, f(n) always lies on or below c*g(n).
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c*g(n) for all n ≥ n0 }
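As a quick worked example (a standard illustration): f(n) = 3n + 2 is O(n). Choosing c = 4 and n0 = 2, we have 3n + 2 ≤ 4n for all n ≥ 2, so f(n) lies on or below c*g(n) with g(n) = n.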
Big Omega Notation
Big-Omega (Ω) notation gives a lower bound for a function f(n) to within a constant factor.
We write f(n) = Ω(g(n)) if there are positive constants n0 and c such that, to the right of n0, f(n) always lies on or above c*g(n).
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c*g(n) ≤ f(n) for all n ≥ n0 }
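A worked example along the same lines: f(n) = 3n + 2 is also Ω(n). Choosing c = 3 and n0 = 1, we have 3n ≤ 3n + 2 for all n ≥ 1, so f(n) lies on or above c*g(n) with g(n) = n.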
Big Theta Notation
Big-Theta (Θ) notation bounds a function f(n) from above and below to within constant factors.
We write f(n) = Θ(g(n)) if there are positive constants n0, c1, and c2 such that, to the right of n0, f(n) always lies between c1*g(n) and c2*g(n) inclusive.
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n) for all n ≥ n0 }
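Combining the two previous examples: f(n) = 3n + 2 is Θ(n), since with c1 = 3, c2 = 4, and n0 = 2 we have 3n ≤ 3n + 2 ≤ 4n for all n ≥ 2.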
Commonly encountered orders of growth are listed below (a small code sketch follows the list) −
• constant − O(1)
• logarithmic − O(log n)
• linear − O(n)
• quadratic − O(n²)
• cubic − O(n³)
• polynomial − n^O(1)
• exponential − 2^O(n)
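A rough, minimal sketch of where some of these classes come from (the function names below are made up for this illustration and are not part of any standard library):

# Hypothetical routines illustrating a few common growth rates.
def constant(arr):
    # O(1): a single operation regardless of input size
    return arr[0]

def logarithmic(n):
    # O(log n): the problem size is halved at every step
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

def linear(arr):
    # O(n): every element is touched once
    total = 0
    for x in arr:
        total += x
    return total

def quadratic(arr):
    # O(n²): every pair of elements is considered
    pairs = 0
    for x in arr:
        for y in arr:
            pairs += 1
    return pairs

data = list(range(1, 9))
print(constant(data), logarithmic(len(data)), linear(data), quadratic(data))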