Asymptotic Notation Demo

The document discusses asymptotic notation used to analyze algorithms such as O-notation for upper bounds, Ω-notation for lower bounds, and Θ-notation for tight bounds. It provides examples of common functions and their asymptotic bounds including polynomials, logarithms, exponentials, and factorials. Properties of asymptotic notation such as transitivity and additivity are covered.


Algorithm Design and Analysis

LECTURE 3
• Asymptotic Notation
• Basic Data Structures

Adam Smith
8/29/2008
A. Smith; based on slides by E. Demaine, C. Leiserson, S. Raskhodnikova, K. Wayne
Asymptotic notation
O-notation (upper bounds):
We write f(n) = O(g(n)) if there
exist constants c > 0, n0 > 0 such
that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0.

EXAMPLE: 2n^2 = O(n^3) (c = 1, n0 = 2)


(A funny, "one-way" use of "=": it relates functions, not values.)
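The definition can be made concrete by testing candidate witnesses numerically. The sketch below (an illustration I am adding, not part of the slides) checks 0 ≤ f(n) ≤ c·g(n) over a sampled range; a finite check cannot prove an asymptotic bound, but it quickly catches wrong constants.

```python
# Sampled sanity check of the witnesses c = 1, n0 = 2 for 2n^2 = O(n^3).
def holds_O(f, g, c, n0, n_max=10_000):
    """True if 0 <= f(n) <= c*g(n) for every sampled n in [n0, n_max]."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

print(holds_O(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=2))  # True
print(holds_O(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=1))  # False: fails at n = 1
```

Note that n0 = 2 is needed here: at n = 1 we have 2n^2 = 2 > 1 = n^3.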
9/15/08
S. Raskhodnikova and A. Smith. Based on notes by E. Demaine and C. Leiserson
Set Definition

O(g(n)) = { f(n) : there exist constants c > 0, n0 > 0
            such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0 }

EXAMPLE: 2n^2 ∈ O(n^3)


(Logicians: λn.2n^2 ∈ O(λn.n^3), but it's convenient to be sloppy,
as long as we understand what's really going on.)
Examples
•  10^6·n^3 + 2n^2 − n + 10 = O(n^3)

•  n^(1/2) + log(n) = O(n^(1/2))

•  n·(log(n) + n^(1/2)) = O(n^(3/2))

•  n = O(n^2)
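One sampled way to see why the first two bounds hold (an added illustration, not from the slides): if f = O(g), the ratio f(n)/g(n) stays bounded as n grows, and for these examples it settles near the dominant term's coefficient.

```python
import math

# Spot-check two of the claimed bounds by sampling f(n)/g(n); a bounded
# ratio is consistent with f = O(g) (evidence, not a proof).
def ratios(f, g, ns=(10, 10**3, 10**5)):
    return [f(n) / g(n) for n in ns]

# 10^6 n^3 + 2n^2 - n + 10 = O(n^3): ratio tends to the leading coefficient 10^6
print(ratios(lambda n: 10**6 * n**3 + 2 * n**2 - n + 10, lambda n: n**3))

# sqrt(n) + log(n) = O(sqrt(n)): ratio tends to 1
print(ratios(lambda n: math.sqrt(n) + math.log(n), lambda n: math.sqrt(n)))
```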

Ω-notation (lower bounds)
O-notation is an upper-bound notation. It
makes no sense to say f(n) is at least O(n2).

Ω(g(n)) = { f(n) : there exist constants c > 0, n0 > 0
            such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0 }

EXAMPLE: √n = Ω(lg n) (c = 1, n0 = 16)
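The witnesses given here (c = 1, n0 = 16) fit the classic example √n = Ω(lg n): at n = 16 both sides equal 4, and √n pulls ahead afterwards. A sampled check of the definition (added illustration, finite range only):

```python
import math

# Check 0 <= 1 * lg(n) <= sqrt(n) for all sampled n >= 16.
ok = all(0 <= 1 * math.log2(n) <= math.sqrt(n) for n in range(16, 100_000))
print(ok)  # True
```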
Ω-notation (lower bounds)
• Be careful: “Any comparison-based sorting algorithm
requires at least O(n log n) comparisons.”
– Meaningless!
– Use Ω for lower bounds.

Θ-notation (tight bounds)

Θ(g(n)) = O(g(n)) ∩ Ω(g(n))

EXAMPLE:

Polynomials are simple:

a_d·n^d + a_{d-1}·n^(d-1) + … + a_1·n + a_0 = Θ(n^d) when a_d > 0
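A quick numeric illustration of the polynomial fact (added, with arbitrary example coefficients): the ratio of a degree-d polynomial to n^d approaches the leading coefficient a_d, which is bounded away from 0 and ∞ when a_d > 0.

```python
# Sample p(n) / n^3 for a cubic: the ratio approaches the leading
# coefficient a_3 = 5, consistent with p(n) = Theta(n^3).
coeffs = [10, -1, 2, 5]  # a_0 = 10, a_1 = -1, a_2 = 2, a_3 = 5 (illustrative values)
p = lambda n: sum(a * n**i for i, a in enumerate(coeffs))
print([p(n) / n**3 for n in (10, 100, 1000)])
```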
o-notation and ω-notation
O-notation and Ω-notation are like ≤ and ≥.
o-notation and ω-notation are like < and >.

o(g(n)) = { f(n) : for any constant c > 0, there is a constant n0 > 0
            such that 0 ≤ f(n) < cg(n) for all n ≥ n0 }

EXAMPLE: 2n^2 = o(n^3) (n0 = 2/c)
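Equivalently, f = o(g) exactly when f(n)/g(n) → 0. A sampled illustration for this example (added sketch): 2n^2 / n^3 = 2/n, which visibly shrinks.

```python
# For f = o(g) the ratio f(n)/g(n) must tend to 0; here it is 2/n.
samples = [2 * n**2 / n**3 for n in (10, 100, 1000, 10_000)]
print(samples)  # [0.2, 0.02, 0.002, 0.0002]
```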


o-notation and ω-notation
O-notation and Ω-notation are like ≤ and ≥.
o-notation and ω-notation are like < and >.

ω(g(n)) = { f(n) : for any constant c > 0, there is a constant n0 > 0
            such that 0 ≤ cg(n) < f(n) for all n ≥ n0 }

EXAMPLE: (n0 = 1+1/c)


Common Functions: Asymptotic Bounds

•  Polynomials. a_0 + a_1·n + … + a_d·n^d is Θ(n^d) if a_d > 0.

•  Polynomial time. Running time is O(n^d) for some constant d independent of the input size n.

•  Logarithms. log_a(n) = Θ(log_b(n)) for all constants a, b > 1, so we can avoid specifying the base.
   log grows slower than every polynomial: for every x > 0, log n = O(n^x).

•  Exponentials. Every exponential grows faster than every polynomial: for all r > 1 and all d > 0, n^d = O(r^n).

•  Factorial. n! = √(2πn)·(n/e)^n·(1 + o(1)) = 2^Θ(n log n), which grows faster than every exponential.
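Two of these claims can be sampled numerically (an added illustration; single data points, not proofs). The crossover point n0 can be very large, which is why small-n experiments are misleading: log n = O(n^0.1) only takes over around n ≈ 10^22.

```python
import math

# log grows slower than every polynomial: log n = O(n^x) even for x = 0.1,
# but the crossover is far out -- n = 10^22 suffices.
n = 10**22
print(math.log(n) < n ** 0.1)  # True

# every exponential beats every polynomial: n^3 = O(r^n) even for r = 1.01.
n = 10_000
print(n**3 < 1.01**n)  # True
```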
Sort by asymptotic order of growth

a)  n log(n)          1.  n^(1/log(n))
b)                    2.  log(n)
c)  log(n)            3.
d)  n^2               4.  n
e)  2^n               5.  n log(n) = Θ(log(n!))
f)  n                 6.
g)  n!                7.  n^2
h)  n^1,000,000       8.  n^1,000,000
i)  n^(1/log(n))      9.  2^n
j)  log(n!)           10. n!
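One way to sanity-check an ordering like this (an added heuristic, not part of the exercise): evaluate each function at a single moderately large n and sort by value. A single sample is only suggestive, since asymptotic order is about limits, but at n = 64 it already agrees with the answer for these functions. Note n^(1/log2 n) is identically 2, a constant.

```python
import math

# Evaluate each function at n = 64 and sort by value (heuristic check).
n = 64
funcs = {
    "n^(1/log2 n)": n ** (1 / math.log2(n)),  # identically 2: a constant
    "log2 n": math.log2(n),
    "n": n,
    "n log2 n": n * math.log2(n),
    "n^2": n**2,
    "2^n": 2**n,
    "n!": math.factorial(n),
}
for name, value in sorted(funcs.items(), key=lambda kv: kv[1]):
    print(name, value)
```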

Conventions for formulas
Convention: A set in a formula represents
an anonymous function in the set.
EXAMPLE: f(n) = n^3 + O(n^2)
(right-hand side) means
f(n) = n^3 + h(n)
for some h(n) ∈ O(n^2).

Convention for formulas
Convention: A set in a formula represents
an anonymous function in the set.
EXAMPLE: n^2 + O(n) = O(n^2)
(left-hand side) means
for any f(n) ∈ O(n):
n^2 + f(n) = h(n)
for some h(n) ∈ O(n^2).

Properties
• Transitivity.
– If f = O(g) and g = O(h) then f = O(h).
– If f = Ω(g) and g = Ω(h) then f = Ω(h).
– If f = Θ(g) and g = Θ(h) then f = Θ(h).
• Additivity.
– If f = O(h) and g = O(h) then f + g = O(h).
– If f = Ω(h) and g = Ω(h) then f + g = Ω(h).
– If f = Θ(h) and g = O(h) then f + g = Θ(h).
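All the additivity rules follow the same one-line pattern from the definitions; a proof sketch for the O case, using the witness constants:

```latex
% Additivity of O: suppose f(n) \le c_1 h(n) for n \ge n_1 and
% g(n) \le c_2 h(n) for n \ge n_2. Then for every n \ge \max(n_1, n_2),
f(n) + g(n) \;\le\; c_1 h(n) + c_2 h(n) \;=\; (c_1 + c_2)\, h(n),
% so f + g = O(h) with witnesses c = c_1 + c_2 and n_0 = \max(n_1, n_2).
```

The Ω case is symmetric (add the two lower bounds), and the Θ case in the last bullet combines an Ω bound from f alone with an O bound from f and g together.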

Exercise: Show that log(n!) = Θ(n log n)
•  Stirling's formula:  n! = √(2πn) · (n/e)^n · (1 + o(1))

log(n!) = log(√(2πn)) + n log(n) − n log(e) + log(1 + o(1))
              [log(1 + o(1)) = log(1) + o(1) = o(1), since log is continuous]
        = n·( log(n) − log(e) + log(2πn)/(2n) ) + o(1)
              [log(e) − log(2πn)/(2n) is constant + o(1)]
        = n·(log(n) − O(1)) + o(1)
        = n log(n)·(1 − O(1/log n)) + o(1)
        = n log(n)·(1 ± o(1)) = Θ(n log n)
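The conclusion can be checked numerically (an added illustration): log(n!)/(n log n) approaches 1, though slowly, since the correction term is O(1/log n). Using math.lgamma(n + 1) = ln(n!) avoids computing the huge factorial itself.

```python
import math

# log(n!) / (n log n) creeps toward 1 at rate ~ 1/log n.
for n in (10**2, 10**4, 10**6):
    print(n, math.lgamma(n + 1) / (n * math.log(n)))
```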

Overview
Notation        Means                                       Think               E.g.                   lim f(n)/g(n)
f(n) = O(g(n))  ∃ c>0, n0>0, ∀ n ≥ n0: 0 ≤ f(n) ≤ c·g(n)    upper bound         100n^2 = O(n^3)        if it exists, it is < ∞
f(n) = Ω(g(n))  ∃ c>0, n0>0, ∀ n ≥ n0: 0 ≤ c·g(n) ≤ f(n)    lower bound         n^100 = Ω(2n)          if it exists, it is > 0
f(n) = Θ(g(n))  both of the above: f = Ω(g) and f = O(g)    tight bound         log(n!) = Θ(n log n)   if it exists, it is > 0 and < ∞
f(n) = o(g(n))  ∀ c>0 ∃ n0>0, ∀ n ≥ n0: 0 ≤ f(n) < c·g(n)   strict upper bound  n^2 = o(2^n)           limit exists and equals 0
f(n) = ω(g(n))  ∀ c>0 ∃ n0>0, ∀ n ≥ n0: 0 ≤ c·g(n) < f(n)   strict lower bound  n^2 = ω(log n)         limit exists and equals ∞
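The last column suggests a heuristic (an added sketch, not from the slides): sample f(n)/g(n) at one large n and see which regime it is consistent with. A single sample cannot detect oscillation or a slowly-moving ratio, so treat this as a sanity check only.

```python
import math

def sampled_ratio(f, g, n=10**6):
    """Sample f(n)/g(n) at one large n (heuristic only)."""
    return f(n) / g(n)

# tiny ratio -> consistent with 100n^2 = o(n^3)
print(sampled_ratio(lambda n: 100 * n**2, lambda n: n**3))
# ratio bounded away from 0 and infinity -> consistent with log(n!) = Theta(n log n)
print(sampled_ratio(lambda n: math.lgamma(n + 1), lambda n: n * math.log(n)))
```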

