
Asymptotic Analysis - I

Instructor: Ashok Singh Sairam


Lecture Plan
• Running Time
▪ Order of Growth
• Asymptotic Analysis
▪ Theta (Θ) notation
▪ Big Oh (O) notation
▪ Omega (Ω) notation
▪ Little o (o) notation
▪ Little omega (𝜔) notation

How to calculate running time?
• The running time of an algorithm typically grows
with the input size
▪ Idea: Analyze running time as a function of input size
• Even on inputs of the same size, the running time can differ
• Idea: Analyze the running time in the
▪ Best Case: minimum running time over all inputs of a given size
▪ Average Case: expected running time over all inputs of a given size
▪ Worst Case: maximum running time over all inputs of a given size; an upper bound (see the sketch below)
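As a small illustration (added here, not part of the original slides), the hypothetical linear-search sketch below counts element comparisons; the best case inspects one element, the worst case inspects all n.

def linear_search(arr, target):
    # Return (index of target, number of comparisons made), or (-1, n) if absent.
    # The comparison count is the quantity that best/worst/average case
    # analysis measures as a function of the input size n = len(arr).
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

data = list(range(1, 1001))          # n = 1000
print(linear_search(data, 1))        # best case: 1 comparison
print(linear_search(data, 1000))     # worst case: n = 1000 comparisons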

Running Time
• The best-case running time is rarely informative
• The average-case running time is useful but often difficult to determine
• We focus primarily on the worst-case running time
▪ It is easier to analyze and it bounds the running time on every input

Constant factors
• The growth rate is not affected by
▪ constant factors or
▪ lower-order terms
• Examples
▪ 10²n + 10⁵ is a linear function
▪ 10⁵n² + 10⁸n is a quadratic function
• How do we get rid of the constant factors to focus
on the essential part of the running time?

Asymptotic Efficiency
• To compare two algorithms with running times
f(n) and g(n), characterize how fast each function
grows (Hint: Consider only the order of growth)
▪ Ignore the actual cost of each statement
▪ Ignore the abstracted constants
▪ Consider only the leading term, and ignore its constant coefficient
• Algorithms that are asymptotically more efficient will be the best choice for all but small inputs

Order of Growth
• The low-order terms in a function are relatively insignificant for large n
  n⁴ + 100n² + 10n + 50 ~ n⁴
i.e., we say that n⁴ + 100n² + 10n + 50 and n⁴ have the same order of growth (a numeric check follows below)
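A quick numeric check (an added sketch, not from the slide): the ratio f(n)/n⁴ approaches 1 as n grows, which is what "same order of growth" means here.

def f(n):
    # f(n) = n^4 + 100n^2 + 10n + 50, the polynomial from the slide above
    return n**4 + 100 * n**2 + 10 * n + 50

for n in (10, 100, 1000, 10000):
    print(n, f(n) / n**4)
# The ratio tends to 1: the low-order terms become insignificant for large n.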

Order of growth: visualization

Asymptotic Notation: Theta
• Given functions f(n) and g(n), we say that f(n) is
Θ(g(n)) if there are positive constants c1 and c2
and n0 such that
0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n), ∀n ≥ n0
▪ Although f(n) ∈ Θ(g(n)), we usually write f(n) = Θ(g(n)); here the equality sign means "is a member of"
▪ We say g(n) is an asymptotically tight bound for f(n)

Example: Θ notation
• Example: Show that (1/2)n² − 3n = Θ(n²)
• Solution: Find values of c1, c2 and n0 s.t.
  c1·n² ≤ (1/2)n² − 3n ≤ c2·n²
• Divide by n²:  c1 ≤ 1/2 − 3/n ≤ c2
• The right-hand inequality holds for c2 = 1/2 (for any n ≥ 1)
• The left-hand inequality holds for n ≥ 7 by choosing c1 ≤ 1/14, i.e. choose n0 = 7 and c1 = 1/14 (a quick numeric check follows below)
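A minimal numeric check of these constants (added illustration, not part of the slide): with c1 = 1/14, c2 = 1/2 and n0 = 7, the sandwich inequality holds over a range of n.

def f(n):
    return 0.5 * n * n - 3 * n   # f(n) = (1/2)n^2 - 3n

c1, c2, n0 = 1 / 14, 1 / 2, 7
# Verify c1*n^2 <= f(n) <= c2*n^2 for every checked n >= n0.
assert all(c1 * n * n <= f(n) <= c2 * n * n for n in range(n0, 100000))
print("Theta bounds hold for n in [7, 100000)")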

Graph of Θ

Asymptotic Notation: big Oh
• Given functions f(n) and g(n), we say that f(n) is
Ο(g(n)) if there exist positive constants c and n0
such that
0 ≤ f(n) ≤ c·g(n), ∀n ≥ n0

▪ Since O gives only an upper bound (no lower bound), f(n) = an + b is also a member of O(n²), i.e. n = O(n²)

Example: O notation
• Example: Show that 5 + 0.001n³ + 0.025n = O(n³)
• Solution: Find values of c and n0 s.t.
  5 + 0.001n³ + 0.025n ≤ c·n³
• Divide by n³:  5/n³ + 0.001 + 0.025/n² ≤ c
• The inequality holds for n0 = 1 and c ≥ 6, since the left-hand side is at most 5.026 for n ≥ 1 (a numeric check follows below)
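A small check (illustrative only) that c = 6 and n0 = 1 indeed work for this big-O claim:

def f(n):
    return 5 + 0.001 * n**3 + 0.025 * n

c, n0 = 6, 1
# Verify f(n) <= c * n^3 for every checked n >= n0.
assert all(f(n) <= c * n**3 for n in range(n0, 100000))
print("O(n^3) bound holds with c = 6, n0 = 1")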

Graph of Ο (big Oh)

big Oh: Some additional points
• One advantage of O-notation is that it helps to clean up expressions:
  f(n) = n³ + 4n² + 100n − 1000
can be written as
  f(n) = n³ + O(n²)
Here O(n²) means there exists some function h(n) = O(n²) such that h(n) = 4n² + 100n − 1000

Asymptotic Notation: big Omega
• Given functions f(n) and g(n), we say that f(n) is
Ω(g(n)) if there exist positive constants c and n0
such that
0 ≤ c·g(n) ≤ f(n), ∀n ≥ n0

▪ Examples: n = Ω(log n), 6n² = Ω(n)
▪ Θ(g(n)) = O(g(n)) ∩ Ω(g(n))

Example: Ω notation
• Example: Show that n/100 = Ω(n)
• Solution: Find values of c and n0 s.t.
  0 ≤ c·n ≤ n/100
• Divide by n:  c ≤ 1/100
• The inequality holds for n0 = 1 and, for example, c = 1/200 (a numeric check follows below)
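The same style of check (added sketch) for the Ω example, with c = 1/200 and n0 = 1:

c, n0 = 1 / 200, 1
# Verify c*n <= n/100 for every checked n >= n0.
assert all(c * n <= n / 100 for n in range(n0, 100000))
print("Omega(n) bound holds with c = 1/200, n0 = 1")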

Graph of Ω

Asymptotic Notation: Intuitive
• O notation: asymptotic "less than":
▪ f(n) = O(g(n)) implies f(n) "≤" g(n)
• Ω notation: asymptotic "greater than":
▪ f(n) = Ω(g(n)) implies f(n) "≥" g(n)
• Θ notation: asymptotic "equality":
▪ f(n) = Θ(g(n)) implies f(n) "=" g(n)

Theorem
• For any two functions f(n) and g(n), we have f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n))

Asymptotic Notation: Little Oh
• The asymptotic upper bound given by Ο may or
may not be asymptotically tight
  5n² = O(n²) is asymptotically tight; 5n = O(n²) is not
• o-notation denotes an upper bound that is not asymptotically tight

Asymptotic Notation: Little Oh (contd.)
• Given functions f(n) and g(n), we say that f(n) is
o(g(n)) if for every constant c > 0 there exists a constant n0 > 0 such that
  0 ≤ f(n) < c·g(n), ∀n ≥ n0
  5n = o(n²), but 5n² ≠ o(n²)
• Intuitively, f(n) becomes insignificant relative to g(n) as n approaches infinity (see the ratio check below)
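Equivalently (a standard fact, shown here as an added sketch), f(n) = o(g(n)) exactly when the ratio f(n)/g(n) tends to 0. Numerically, 5n/n² shrinks while 5n²/n² does not:

for n in (10, 100, 1000, 10000):
    print(n, (5 * n) / n**2, (5 * n**2) / n**2)
# The first ratio tends to 0, so 5n = o(n^2);
# the second stays at 5, so 5n^2 is not o(n^2).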

Asymptotic Notation: Little omega
• The asymptotic lower bound given by Ω may or may not be asymptotically tight
• For example, 6n² = Ω(n²) is asymptotically tight, but 6n² = Ω(n) is not
• Analogous to o-notation, ω-notation denotes a lower bound that is not asymptotically tight

Asymptotic Notation: Little Omega (contd.)
• Given functions f(n) and g(n), we say that f(n) is
ω(g(n)) if for every constant c > 0 there exists a constant n0 > 0 such that
  0 ≤ c·g(n) < f(n), ∀n ≥ n0
  (1/2)n² = ω(n), but (1/2)n² ≠ ω(n²)
• Intuitively, f(n) becomes arbitrarily large relative to g(n) as n approaches infinity (see the ratio check below)
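Correspondingly (added sketch), f(n) = ω(g(n)) means the ratio f(n)/g(n) grows without bound: (n²/2)/n diverges, while (n²/2)/n² stays constant.

for n in (10, 100, 1000, 10000):
    print(n, (0.5 * n**2) / n, (0.5 * n**2) / n**2)
# The first ratio grows without bound, so (1/2)n^2 = omega(n);
# the second stays at 0.5, so (1/2)n^2 is not omega(n^2).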
