Asymptotic
Of the fixed and variable components, the variable part must be determined
accurately so that the actual space requirement of an algorithm 'A' can be
identified. To identify the space complexity of any algorithm, the following
steps can be followed:
1. Determine the variables which are instantiated with some default
values.
2. Determine which instance characteristics should be used to
measure the space requirement; this will be problem specific.
3. Generally the choices are limited to quantities related to the number
and magnitudes of the inputs to and outputs from the algorithm.
4. Sometimes more complex measures of the interrelationships among
the data items can be used.
In this way both the space complexity and the time complexity can be
calculated. Together, these two complexities make up the performance
analysis of an algorithm; neither one should be used in isolation. They
also help in defining the parameters on the basis of which we optimize
algorithms.
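The fixed/variable split described above can be sketched on a small algorithm. The function below is an illustrative example (not taken from the text) showing which storage counts as the fixed part and which as the variable part:

```python
def array_sum(a):
    # Fixed part: a constant number of scalar variables
    # (total, x) regardless of the input size.
    total = 0
    for x in a:        # Variable part: the input list 'a' itself
        total += x     # occupies space proportional to n = len(a).
    return total

# Space requirement S(n) = c + n: a constant c for the fixed part
# plus n cells for the variable part, so the space complexity is O(n).
print(array_sum([1, 2, 3, 4]))  # → 10
```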
For example:
1. Let us consider T(n) = an + b where a > 0. If the input size 'n' is
multiplied by 'k', then the dominant term of T(n), i.e. 'an', is also
multiplied by 'k'. This produces T(kn) = k(an) + b, and we can say that
T(n) has a linear order of growth.
2. Let us consider T(n) = an² + bn + c where a > 0. If the input size 'n' is
multiplied by 'k', this produces T(kn) = (k²a)n² + (kb)n + c; thus the
dominant term is multiplied by k², and we can say that T(n) has a
quadratic order of growth. Similar reasoning applies to cubic,
logarithmic and other orders of growth.
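The scaling argument in the two examples above can be checked numerically. The coefficients a, b, c below are arbitrary illustrative values, not from the text:

```python
def t_linear(n, a=3, b=7):
    # T(n) = a*n + b: the dominant term a*n scales linearly with n.
    return a * n + b

def t_quadratic(n, a=3, b=7, c=2):
    # T(n) = a*n^2 + b*n + c: the dominant term scales with k^2.
    return a * n**2 + b * n + c

n, k = 1000, 10
# For large n, T(kn)/T(n) approaches k for the linear model...
print(t_linear(k * n) / t_linear(n))        # ≈ 10
# ...and k^2 for the quadratic model.
print(t_quadratic(k * n) / t_quadratic(n))  # ≈ 100
```

For small n the lower-order terms b and c still distort the ratio; the approximation improves as n grows, which is exactly the point of the asymptotic view.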
Thus, for efficiency with respect to large problems, the analysis is made
for large values of the input size, where the dominant term plays the most
important role. The analysis conducted for large values of the input size
is called asymptotic analysis.
Asymptotic Notations
Execution time of an algorithm depends on the instruction set, processor
speed, disk I/O speed, etc. Hence, we estimate the efficiency of an algorithm
asymptotically. The following asymptotic notations are commonly used:
O − Big Oh Notation
Ω − Big omega Notation
θ − Big theta Notation
o − Little Oh Notation
ω − Little omega Notation
Big Oh, O
We formally define O(g(n)) (big-oh of g of n) as the set of functions f(n)
for which there exist positive constants c and n0 such that
0 ⩽ f(n) ⩽ c·g(n) for all n ⩾ n0. Hence, the function g(n) is an upper
bound for the function f(n), as g(n) grows at least as fast as f(n).
Example
Let us consider a given function, f(n) = 4n³ + 10n² + 5n + 1.
Considering g(n) = n³, we have 0 ⩽ f(n) ⩽ 20·g(n) for all n ⩾ 1, since
10n² ⩽ 10n³, 5n ⩽ 5n³ and 1 ⩽ n³ when n ⩾ 1.
Hence, the complexity of f(n) can be represented as O(g(n)), i.e. O(n³).
Big Omega, Ω
We formally define Ω(g(n)) (big-omega of g of n) as the set of functions
f(n) for which there exist positive constants c and n0 such that
0 ⩽ c·g(n) ⩽ f(n) for all n ⩾ n0. Hence, the function g(n) is a lower
bound for the function f(n).
Example
Let us consider a given function, f(n) = 4n³ + 10n² + 5n + 1.
Considering g(n) = n³, we have f(n) ⩾ 4·g(n) for all n ⩾ 1, since the
remaining terms 10n² + 5n + 1 are positive.
Hence, the complexity of f(n) can be represented as Ω(g(n)), i.e. Ω(n³).
Big Theta, θ
We formally define θ(g(n)) (big-theta of g of n) as the set of functions
f(n) for which there exist positive constants c1, c2 and n0 such that
0 ⩽ c1·g(n) ⩽ f(n) ⩽ c2·g(n) for all n ⩾ n0. Hence, g(n) is both an
upper and a lower bound, i.e. a tight bound, for f(n).
Example
Let us consider a given function, f(n) = 4n³ + 10n² + 5n + 1.
Considering g(n) = n³, we have 4·g(n) ⩽ f(n) ⩽ 20·g(n) for all n ⩾ 1.
Hence, the complexity of f(n) can be represented as θ(g(n)), i.e. θ(n³).
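The bounds on the example function can be verified by brute force over a range of inputs. The witnesses c1 = 4, c2 = 20, n0 = 1 used below are one illustrative choice, not the only valid one:

```python
def f(n):
    # The example function from the text.
    return 4 * n**3 + 10 * n**2 + 5 * n + 1

def g(n):
    return n**3

# With c1 = 4, c2 = 20 and n0 = 1, the sandwich
# c1*g(n) <= f(n) <= c2*g(n) holds for every n >= n0, which
# certifies f(n) = Ω(n^3), f(n) = O(n^3) and hence f(n) = θ(n^3).
assert all(4 * g(n) <= f(n) <= 20 * g(n) for n in range(1, 10_000))
print("bounds hold for n = 1 .. 9999")
```

A finite check is not a proof, but it is a quick sanity test that the chosen constants are plausible before arguing term by term as in the text.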
Little Oh, o
We formally define o(g(n)) (little-oh of g of n) as the set of functions
f(n) such that for any positive constant c > 0 there exists a value
n0 > 0 with 0 ⩽ f(n) < c·g(n) for all n ⩾ n0; equivalently, f(n)/g(n)
tends to 0 as n tends to infinity.
Example
Let us consider the same function, f(n) = 4n³ + 10n² + 5n + 1.
Considering g(n) = n⁴,
lim(n→∞) (4n³ + 10n² + 5n + 1) / n⁴ = 0.
Hence, the complexity of f(n) can be represented as o(g(n)), i.e. o(n⁴).
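The limit condition behind the little-oh claim can be observed numerically by tabulating the ratio f(n)/n⁴ for growing n:

```python
def f(n):
    # The example function from the text.
    return 4 * n**3 + 10 * n**2 + 5 * n + 1

# For f(n) = o(n^4) the ratio f(n)/n^4 must tend to 0; printing it
# for increasing n shows the expected decay toward zero.
for n in (10, 100, 1000, 10_000):
    print(n, f(n) / n**4)
```

Each tenfold increase in n shrinks the ratio roughly tenfold, matching the fact that the dominant term of f(n) is one degree below n⁴.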