Lec 2 Complexity Analysis
Basic steps: n^2
Analyzing an Algorithm
// Input: int A[N], array of N integers
// Output: Sum of all numbers in array A
• Arithmetic operations:
x = 5 * y + 4 - z;
• Array referencing:
A[j] = 5;
• Most conditional tests:
if (x < 12) ...
Analyzing Loops
F(n) = n/2
F(n) = O(n)
Analyzing Loops
F(n) = 2n^2 + 2n + 1
F(n) = O(n^2)
Analyzing Nested Loops
• Treat just like a single loop and evaluate each
level of nesting as needed:
int j, k;
for (j = 0; j < n; j++)
    for (k = 0; k < n; k++)
        sum += k + j;
SUM RULE: if one code segment runs in O(f(n)) and the following segment runs in O(g(n)), then executing both in sequence takes O(f(n) + g(n)) = O(max(f(n), g(n))).
Analyzing an Algorithm
• Loop index doesn’t vary linearly
i = 1;
while ( i < n ) {
s;
i = 2 * i;
}
• i takes values 1, 2, 4, … until it exceeds n
i takes the values:
• 1
• 1 x 2 = 2
• 2 x 2 = 2^2
• 2^2 x 2 = 2^3
• ...
• 2^k
Assume the loop exits as soon as i >= n. At that point i = 2^k, so 2^k = n and therefore k = log2(n): the body executes O(log n) times.
Analyzing Conditional Statements
What about conditional statements such as
if (condition)
    statement1;
else
    statement2;
where statement1 runs in O(N) time and statement2 runs in O(N^2) time?
Take the worst case over the branches: since either branch may execute, the conditional as a whole is O(N^2), plus the cost of evaluating the condition.
Growth of 5n+3
Estimated running time for different values of N:
N = 10        =>  53 steps
N = 100       =>  503 steps
N = 1,000     =>  5,003 steps
N = 1,000,000 =>  5,000,003 steps
• N + N + N + ... + N (a total of N times)
• N * N = N^2, which is O(N^2)
• 1 + 2 + 4 + ... + 2^N
• 2^(N+1) - 1 = 2 * 2^N - 1, which is O(2^N)
[Table: runtimes assuming 10^6 instructions/sec]
• Basically, we want to find a function g(n) that is eventually (for all n beyond some threshold n0) always bigger than f(n)
Big-Oh Notation (§3.4)
• Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n0 such that f(n) <= c*g(n) for all n >= n0
Big Oh Notation
If f(N) and g(N) are two complexity functions, we say
f(N) = O(g(N))
if there are positive constants c and N0 such that f(N) <= c*g(N) for all N >= N0.
*big-O = upper bound
Comparing Functions
• As inputs get larger, any algorithm of a
smaller order will be more efficient than an
algorithm of a larger order
[Figure: time (steps) vs input (size), plotting 0.05*N^2 = O(N^2) against 3*N = O(N); the curves cross at N = 60, after which the O(N) algorithm is faster.]
Big Omega Notation
• f(N) = Ω(g(N)): g(N) is a lower bound; f(N) grows at least as fast as g(N)
Big Theta Notation
• f(N) = Θ(g(N)): f(N) is both O(g(N)) and Ω(g(N)), a tight bound
Asymptotic notation
• Example:
• f(n) = 3n^5 + n^4 = Θ(n^5)
Polynomial and Intractable Algorithms
• Polynomial Time complexity
• An algorithm is said to be polynomial if it is
O( nd ) for some integer d
• Polynomial algorithms are said to be efficient
• They solve problems in reasonable times!
• Intractable problems
• Problems for which no polynomial-time
algorithm is known
• We will come back to this important class later
Asymptotic order of growth
• Θ(g(n)) ("="): functions that grow at the same rate as g(n)
• O(g(n)) ("<="): functions that grow no faster than g(n)
Performance Classification
f(n)      Classification
1         Constant: run time is fixed and does not depend on n. Most instructions are
          executed once, or only a few times, regardless of the amount of information
          being processed.
log n     Logarithmic: when n increases, so does run time, but much more slowly. When n
          doubles, log n increases by a constant; it does not double until n increases
          to n^2. Common in programs that solve large problems by transforming them
          into smaller problems.
n         Linear: run time varies directly with n. Typically, a small amount of
          processing is done on each element.
n log n   When n doubles, run time slightly more than doubles. Common in programs that
          break a problem down into smaller sub-problems, solve them independently,
          then combine the solutions.
n^2       Quadratic: when n doubles, run time increases fourfold. Practical only for
          small problems; typically the program processes all pairs of the input
          (e.g. in a doubly nested loop).
n^3       Cubic: when n doubles, run time increases eightfold.
2^n       Exponential: when n doubles, run time squares. This is often the result of a
          natural, "brute force" solution.
Size does matter
N     log2 N   5N      N log2 N   N^2      2^N
8     3        40      24         64       256
16    4        80      64         256      65,536
32    5        160     160        1,024    ~10^9
64    6        320     384        4,096    ~10^19
128   7        640     896        16,384   ~10^38
256   8        1,280   2,048      65,536   ~10^77
A Display of the Growth of Functions
Commonly Used in Big-O Estimates
Typical Big O Functions – "Grades"
Function     Common Name
N!           Factorial       <- running time grows "quickly" with more input
2^N          Exponential
N^d, d > 3   Polynomial
N^3          Cubic
N^2          Quadratic
N*sqrt(N)    N square-root-N
N log N      N log N
N            Linear
sqrt(N)      Root-N
log N        Logarithmic     <- running time grows "slowly" with more input
1            Constant
Review of Three Common Sets
• O(g(n)): an upper bound on growth
• Ω(g(n)): a lower bound on growth
• Θ(g(n)): a tight bound (both O and Ω)
These bounds hold for all inputs beyond some threshold n0.
Properties of the O notation
• Constant factors may be ignored
  For all k > 0, kf is O(f)
• Higher powers grow faster
  n^r is O(n^s) if 0 <= r <= s
• The fastest growing term dominates a sum
  If f is O(g), then f + g is O(g)
  e.g. an^4 + bn^3 is O(n^4)
• A polynomial's growth rate is determined by its leading term
  If f is a polynomial of degree d, then f is O(n^d)
Properties of the O notation
• "f is O(g)" is transitive
  If f is O(g) and g is O(h), then f is O(h)
• The product of upper bounds is an upper bound for the product
  If f is O(g) and h is O(r), then fh is O(gr)
• Exponential functions grow faster than powers
  n^k is O(b^n) for all b > 1 and k >= 0
  e.g. n^20 is O(1.05^n)
• Logarithms grow more slowly than powers
  log_b(n) is O(n^k) for all b > 1 and k > 0
  e.g. log2(n) is O(n^0.5)
Important!
Properties of the O notation
• All logarithms grow at the same rate
  log_b(n) is O(log_d(n)) for all b, d > 1