
Data Structures

Lecture: Asymptotic Notations & Complexity


Analysis

By
Arvind Kumar
Asst. Professor,
Lovely Professional University, Punjab
Contents
• Basic Terminology
• Complexity of Algorithm
• Asymptotic Notations
• Review Questions
Basic Terminology
• Algorithm: a finite, step-by-step list of well-defined
  instructions for solving a particular problem.

• Complexity of Algorithm: a measure of the amount of time or
  space required by an algorithm for an input of a given size (n).

• Time and Space are the two major measures of the efficiency of
  an algorithm.
Algorithm

(Diagram: an algorithm maps a specification of input, e.g. any
sequence of natural numbers, to a specification of output as a
function of the input, e.g. the same numbers in sorted order.)
Characteristics of Good Algorithm
• Efficient
  • Running time
  • Space used

• Efficiency as a function of input size
  • Size of input
  • Number of data elements
Time-Space Tradeoff
• By increasing the amount of space for storing the
data, one may be able to reduce the time needed for
processing the data, or vice versa.
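As an illustrative sketch of this tradeoff (the Fibonacci example
and the function names below are not from the slides, just a common
demonstration): storing previously computed results in a table costs
O(n) extra space but removes the exponential recomputation of the
plain recursion.

#include <stdio.h>

#define MAX_N 90

/* Plain recursion: no extra storage, but exponential running time. */
long long fib_slow(int n)
{
    if (n < 2) return n;
    return fib_slow(n - 1) + fib_slow(n - 2);
}

/* Memoized version: an O(n) table of stored results gives O(n) time. */
long long memo[MAX_N];   /* zero-initialized; 0 means "not computed yet" */

long long fib_fast(int n)
{
    if (n < 2) return n;
    if (memo[n] != 0) return memo[n];
    memo[n] = fib_fast(n - 1) + fib_fast(n - 2);
    return memo[n];
}

int main(void)
{
    printf("%lld\n", fib_fast(50));  /* fast: extra space buys time     */
    printf("%lld\n", fib_slow(30));  /* slow, but uses no extra storage */
    return 0;
}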
Importance of Algorithm Analysis

There are two basic requirements for solving any particular
problem:
• Data
• Instructions to manipulate the data, i.e., an algorithm

The choice of an efficient algorithm is of great importance. It can
be made by considering the following factors:
• Programming requirements of the algorithm
• Time requirements of the algorithm
• Space requirements of the algorithm
Complexity of Algorithm
• The time and space used by an algorithm M are the two main
  measures of its efficiency.

• Time complexity of an algorithm quantifies the amount of time
  taken by the algorithm to run as a function of the length of the
  input.

• Space complexity of an algorithm quantifies the amount of space
  or memory taken by the algorithm to run as a function of the
  length of the input.

• Complexity of Algorithm: a function f(n) which gives the running
  time and/or space requirement of algorithm M in terms of the
  size n of the input data.

• Worst Case: the maximum value of f(n) for any possible input.

• Average Case: the expected or average value of f(n).

• Best Case: the minimum possible value of f(n).
Analysis of Insertion Sort Algorithm
                                        cost    times
for j ← 2 to n do                        c1      n
    key ← A[j]                           c2      n−1
    i ← j−1                              c3      n−1
    while i > 0 and A[i] > key do        c4      Σ t_j
        A[i+1] ← A[i]                    c5      Σ (t_j − 1)
        i ← i−1                          c6      Σ (t_j − 1)
    A[i+1] ← key                         c7      n−1

Here t_j is the number of times the while-loop test is executed for
that value of j, and each sum runs over j = 2 … n.

Total time:
T(n) = c1·n + (c2 + c3 + c7)(n − 1) + c4·Σ t_j + (c5 + c6)·Σ (t_j − 1)
Analysis of Insertion Sort
T(n) = c1·n + (c2 + c3 + c7)(n − 1) + c4·Σ t_j + (c5 + c6)·Σ (t_j − 1)

• Best Case: the elements are already sorted, so t_j = 1 for all j;
  the running time f(n) is a linear function of n.
• Worst Case: the elements are sorted in reverse order, so t_j = j;
  the running time f(n) is a quadratic function of n.
• Average Case: on average t_j ≈ j/2; the running time f(n) is
  still a quadratic function of n.
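As a concrete companion to the pseudocode analyzed above, here is a
minimal C sketch of insertion sort (the sample array in main is only
an illustration):

#include <stdio.h>

/* Insertion sort on an int array of length n: the 1-based pseudocode
   above rewritten with 0-based C indexing. */
void insertion_sort(int a[], int n)
{
    for (int j = 1; j < n; j++) {        /* for j <- 2 to n            */
        int key = a[j];                  /* key <- A[j]                */
        int i = j - 1;                   /* i <- j - 1                 */
        while (i >= 0 && a[i] > key) {   /* while i > 0 and A[i] > key */
            a[i + 1] = a[i];             /* shift the larger element   */
            i--;
        }
        a[i + 1] = key;                  /* A[i+1] <- key              */
    }
}

int main(void)
{
    int a[] = {5, 2, 4, 6, 1, 3};
    int n = sizeof(a) / sizeof(a[0]);

    insertion_sort(a, n);
    for (int k = 0; k < n; k++)
        printf("%d ", a[k]);
    printf("\n");
    return 0;
}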
Rate of Growth
• The rate of growth of some standard functions g(n) is:

  log2 n < n < n·log2 n < n^2 < n^3 < 2^n
Asymptotic Notations
• Goal: to simplify the analysis of running time.

• Useful for identifying how the running time of an algorithm
  increases with the size of the input, in the limit.

• An asymptote is a line that approaches a curve but never
  touches it.
Asymptotic Notations
• Complexity can be defined as the rate at
which the storage or time requirement
grows as a function of the problem size.
• Asymptotic analysis is used to simplify the analysis of running
  time by removing details that may be affected by the hardware or
  compiler used.
• Example: rounding off numbers, such as 999999 ≈ 1000000.
Asymptotic Notations
• To measure the complexity of an algorithm, the following
  asymptotic notations can be used:
  • Big O Notation
  • Big Omega (Ω) Notation
  • Big Theta (Θ) Notation
  • Little o Notation
  • Little Omega (ω) Notation
Asymptotic Notations
Special Classes of Algorithms
• Logarithmic: O(log n)
• Linear: O(n)
• Quadratic: O(n^2)
• Polynomial: O(n^k), k >= 1
• Exponential: O(a^n), a > 1
Big-Oh (O) Notation
• Asymptotic upper bound.

• f(n) = O(g(n)) if there exist constants c > 0 and n0 such that
  f(n) <= c·g(n) for all n >= n0.

• f(n) and g(n) are functions over non-negative integers.

• Used for Worst-case analysis.

  (Graph: time versus input value, with c·g(n) lying above f(n)
  for all n >= n0.)
Big-Oh (O) Notation
• Simple Rule:
  Drop lower-order terms and constant factors.

  Examples:
  • 50·n·log n is O(n log n)
  • 8n^2·log n + 5n^2 + n is O(n^2 log n)
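To see how the definition applies to the second example, one
illustrative (not unique) choice of constants is c = 14 and n0 = 2:
for n >= 2 we have log n >= 1 and n <= n^2 log n, so

  8n^2 log n + 5n^2 + n <= 8n^2 log n + 5n^2 log n + n^2 log n
                         = 14·n^2 log n,

i.e. f(n) <= c·g(n) for all n >= n0.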
Big-Omega (Ω) Notation
• Asymptotic lower bound.

• f(n) = Ω(g(n)) if there exist constants c > 0 and n0 such that
  c·g(n) <= f(n) for all n >= n0.

• Used to describe Best-case running time.
Big-Theta (Ө)Notation
• Asymptotic tight bound.

• f(n) = Θ(g(n)) if there exist constants c1, c2 > 0 and n0 such
  that c1·g(n) <= f(n) <= c2·g(n) for all n >= n0.

• f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
Little-Oh (o) Notation
• Non-tight analogue of Big-Oh.

• f(n) = o(g(n)) if for every constant c > 0 there exists an n0
  such that f(n) < c·g(n) for all n >= n0.

• Used for comparisons of running times.
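An illustrative example (not from the original slides): n = o(n^2),
because for any constant c > 0 we can pick n0 > 1/c, and then
n < c·n^2 for all n >= n0. In contrast, 2n is not o(n): with c = 1
the inequality 2n < 1·n never holds, so the bound is not non-tight.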


Analysis of Algorithms
• // Here c is a constant
  for (int i = 1; i <= c; i++)
  {
      // some O(1) expressions
  }

• O(1) / Constant Time Complexity: the time complexity of a
  function (or set of statements) is considered O(1) if it does
  not contain a loop, recursion, or a call to any other
  non-constant-time function.
Analysis of Algorithms
• for (int i = 1; i <= n; i += c)
  {
      // some O(1) expressions
  }

• for (int i = n; i > 0; i -= c)
  {
      // some O(1) expressions
  }

• O(n) / Linear Time Complexity: the time complexity of a loop is
  considered O(n) if the loop variable is incremented/decremented
  by a constant amount. Each loop above runs about n/c times.
Analysis of Algorithms
• for (int i = 1; i <= n; i += c)
  {
      for (int j = 1; j <= n; j += c)
      {
          // some O(1) expressions
      }
  }

• for (int i = n; i > 0; i -= c)
  {
      for (int j = i + 1; j <= n; j += c)
      {
          // some O(1) expressions
      }
  }

• O(n^2) / Quadratic Time Complexity: the time complexity of nested
  loops is equal to the number of times the innermost statement is
  executed. Both loops above execute their innermost statement
  O(n^2) times.
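As a rough empirical check of the quadratic growth, here is a small
sketch (with the step c fixed to 1 and n = 1000, both chosen only
for the demo) that simply counts how many times the innermost
statement runs:

#include <stdio.h>

int main(void)
{
    long long count = 0;
    int n = 1000;                        /* illustrative input size       */
    int c = 1;                           /* step size, fixed for the demo */

    for (int i = 1; i <= n; i += c) {
        for (int j = 1; j <= n; j += c) {
            count++;                     /* stands in for the O(1) work   */
        }
    }

    /* For c = 1 this prints n*n = 1000000, matching the O(n^2) bound. */
    printf("n = %d, innermost statement ran %lld times\n", n, count);
    return 0;
}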
Analysis of Algorithms
• for (int i = 1; i <= n; i *= c)
  {
      // some O(1) expressions
  }

• for (int i = n; i > 0; i /= c)
  {
      // some O(1) expressions
  }

• O(log n) / Logarithmic Time Complexity: the time complexity of a
  loop is considered O(log n) if the loop variable is divided or
  multiplied by a constant factor, since the loop then runs about
  log_c(n) times.
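A quick illustrative check (with c = 2 and n = 1,000,000 chosen only
for the demo) that such a loop runs about log_c(n) times:

#include <stdio.h>

int main(void)
{
    int n = 1000000;                     /* illustrative input size */
    int count = 0;

    for (int i = 1; i <= n; i *= 2)      /* c = 2 for the demo      */
        count++;

    /* Prints 20, close to log2(1000000) ~ 19.9, i.e. O(log n) growth. */
    printf("iterations = %d\n", count);
    return 0;
}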
Analysis of Algorithms
• for (int i = 2; i <= n; i = pow(i, c))
  {
      // some O(1) expressions
  }

• // Here fun is sqrt or cuberoot or any other constant root
  for (int i = n; i > 1; i = fun(i))
  {
      // some O(1) expressions
  }

• O(log log n) Time Complexity: the time complexity of a loop is
  considered O(log log n) if the loop variable is increased or
  decreased exponentially by a constant power, as above
  (i = i^c or i = i^(1/c)).
Analysis of Algorithms
• for (int i = 2; i*i <= n; i++)
  {
      // some O(1) expressions
  }

• O(√n) Time Complexity: the loop runs while i <= √n, i.e. about
  √n times.
Questions
Review Questions
• When is one algorithm said to be better than another?

• Can an algorithm have different running times on different
  machines?

• How does an algorithm's running time depend on the machine on
  which it is executed?
Review Questions
Find out the complexity:

function ()
{
    if (condition)
    {
        for (i = 0; i < n; i++)
        {
            // simple statements
        }
    }
    else
    {
        for (j = 1; j < n; j++)
            for (k = n; k > 0; k--)
            {
                // simple statement
            }
    }
}
