Analyzing Recursive Algorithms
A recursive algorithm can often be described by a recurrence equation that describes the overall runtime on a problem of size n in terms of the runtime on smaller inputs.

For divide-and-conquer algorithms, we get recurrences like:

T(n) = Θ(1)                      if n ≤ c
T(n) = aT(n/b) + D(n) + C(n)     otherwise

where
• a = number of subproblems we divide the problem into
• n/b = size of the subproblems (in terms of n)
• D(n) = time to divide the size-n problem into subproblems
• C(n) = time to combine the subproblem solutions to get the answer for the problem of size n

= Θ(lg n) ??  Which algorithm ??

For recursive algorithms such as computing the factorial of n, we get an expression like the following:

T(n) = Θ(1)                      if n = 0
T(n) = T(n-1) + D(n) + C(n)      otherwise

where
• n-1 = size of the subproblem (in terms of n)
• D(n) = time to divide the size-n problem into subproblems (O(1))
• C(n) = time to combine the subproblem solutions to get the answer for the problem of size n (O(1))

What is the running time of this algorithm for an input of size n? Are there best-case and worst-case input instances?
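The factorial algorithm behind this second recurrence can be written out directly; a minimal Python sketch (the recursion mirrors T(n) = T(n-1) + Θ(1), since both the "divide" and "combine" work are constant-time):

```python
def factorial(n):
    # Base case: n = 0 takes constant time, matching T(0) = Theta(1).
    if n == 0:
        return 1
    # One subproblem of size n - 1; D(n) and C(n) are both O(1)
    # (a decrement and a multiplication), so T(n) = T(n-1) + Theta(1),
    # which solves to Theta(n).
    return n * factorial(n - 1)
```

Since every input of size n triggers exactly the same chain of n recursive calls, there is no separate best or worst case here.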
Mathematical induction with a good guess

Suppose T(n) = 1 if n = 2, and T(n) = T(n/2) + Θ(1) if n = 2^k, for k > 1.
Show T(n) = lg n by induction on the exponent k (taking the Θ(1) term to be 1).

Basis: When k = 1, n = 2. T(2) = lg 2 = 1.
IHOP: Assume T(2^k) = lg 2^k = k for some k > 1.
Inductive step: Show T(2^(k+1)) = lg(2^(k+1)) = k + 1.

T(2^(k+1))  =  T(2^k) + 1     /* given */
            =  (lg 2^k) + 1   /* by inductive hypothesis */
            =  k + 1          /* lg 2^k = k */

Suppose T(n) = 1 if n = 1, and T(n) = T(n-1) + Θ(n) for n > 1.
Show T(n) = O(n^2) by induction (taking the Θ(n) term to be n).

Basis: When n = 1, T(1) = 1 ≤ 1^2.
IHOP: Assume T(m) ≤ m^2 for all m < k.
Inductive step: Show T(k) ≤ k^2.

T(k)  =  T(k-1) + k     /* given */
      ≤  (k-1)^2 + k    /* by inductive hypothesis */
      =  k^2 - k + 1
      ≤  k^2            /* for k ≥ 1 */
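Both guesses can be sanity-checked numerically by instantiating the Θ-terms with exact constants, as in the proofs above (a quick check, not a proof; the function names are ours):

```python
def T_halve(n):
    # T(2) = 1, T(n) = T(n/2) + 1 for n = 2^k, k > 1
    # (the Theta(1) term instantiated as 1).
    return 1 if n == 2 else T_halve(n // 2) + 1

def T_linear(n):
    # T(1) = 1, T(n) = T(n-1) + n for n > 1
    # (the Theta(n) term instantiated as n).
    return 1 if n == 1 else T_linear(n - 1) + n

# First claim: T(2^k) = lg(2^k) = k exactly.
assert all(T_halve(2 ** k) == k for k in range(1, 20))

# Second claim: T(n) <= n^2 (in fact T(n) = n(n+1)/2 here).
assert all(T_linear(n) <= n * n for n in range(1, 200))
```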
Solving Recurrences: Master Method (§4.3)

The master method provides a 'cookbook' method for solving recurrences of a certain form.

Master Theorem: Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers as:

T(n) = aT(n/b) + f(n)

Recall, a is the number of subproblems and n/b is the size of each subproblem.

What is log_b n? The number of times we divide n by b to reach O(1).

Intuition: Compare f(n) with Θ(n^(log_b a)).

case 1: f(n) is "polynomially smaller than" Θ(n^(log_b a))
case 2: f(n) is "asymptotically equal to" Θ(n^(log_b a))
case 3: f(n) is "polynomially larger than" Θ(n^(log_b a))

Notation: lg^k n = (lg n)^k and lg lg n = lg(lg n).
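The "number of times we divide by b" reading of the logarithm can be checked with a short loop (a sketch; the function name is ours):

```python
def divisions_to_one(n, b):
    # Count how many times n can be divided by b before reaching 1;
    # for n an exact power of b this is exactly log_b(n).
    count = 0
    while n > 1:
        n //= b
        count += 1
    return count

# For n = b^k the count is exactly k = log_b(n).
assert divisions_to_one(1024, 2) == 10   # log_2(1024) = 10
assert divisions_to_one(81, 3) == 4      # log_3(81)  = 4
```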
Recurrence for the worst-case running time of Merge-Sort:

T(n) = c               if n = 1
T(n) = 2T(n/2) + cn    otherwise

Case 3 requires us to also show a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, the "regularity" condition.

The regularity condition always holds whenever f(n) = n^k and f(n) = Ω(n^(log_b a + ε)), so we don't need to check it when f(n) is a polynomial.
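The Merge-Sort recurrence can be evaluated exactly for powers of 2 to confirm the Θ(n lg n) guess; a minimal sketch with the constant-factor terms set to c = 1 (our instantiation):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Merge-sort recurrence with c = 1:
    # T(1) = 1, T(n) = 2*T(n/2) + n, for n a power of 2.
    return 1 if n == 1 else 2 * T(n // 2) + n

# Unrolling for n = 2^k gives T(n) = n*lg(n) + n, i.e. Theta(n lg n).
for k in range(1, 16):
    n = 2 ** k
    assert T(n) == n * k + n
```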
Solving Recurrences: Master Method (§4.3)

These 3 cases do not cover all the possibilities for f(n).

There is a gap between cases 1 and 2 when f(n) is smaller than n^(log_b a), but not polynomially smaller.

There is a gap between cases 2 and 3 when f(n) is larger than n^(log_b a), but not polynomially larger.

If the function falls into one of these 2 gaps, or if the regularity condition can't be shown to hold, then the master method can't be used to solve the recurrence.

A more general version of Case 2 follows:

T(n) = Θ(n^(log_b a) lg^(k+1) n)   if f(n) = Θ(n^(log_b a) lg^k n) for k ≥ 0

This case covers the gap between cases 2 and 3 in which f(n) is larger than n^(log_b a) by only a polylog factor. We'll see an example of this type of recurrence in class.
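For instance (our own illustration, not the in-class example), T(n) = 2T(n/2) + n lg n has f(n) = n^(log_b a) lg n, i.e. k = 1, so the extended Case 2 gives T(n) = Θ(n lg^2 n). A quick numeric sanity check:

```python
from math import log2

def T(n):
    # T(1) = 1, T(n) = 2*T(n/2) + n*lg(n), for n a power of 2.
    return 1 if n == 1 else 2 * T(n // 2) + n * log2(n)

# If T(n) = Theta(n lg^2 n), the ratio T(n) / (n * lg(n)^2) should
# settle near a constant (here it approaches 1/2).
for k in [10, 15, 20]:
    n = 2 ** k
    ratio = T(n) / (n * log2(n) ** 2)
    assert 0.4 < ratio < 0.7
```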
Example: T(n) = 7T(n/2) + n^2

• a = 7, b = 2, f(n) = n^2, n^(log_b a) = n^(log_2 7) ≈ n^2.81
• compare f(n) = n^2 with n^(log_2 7): n^2 = O(n^(log_2 7 − ε)) for ε ≈ 0.8 (so f(n) is polynomially smaller)
• Case 1 holds and T(n) = Θ(n^(log_2 7))

More examples:
c) T(n) = 16T(n/4) + n^2
d) T(n) = 5T(n/2) + n^2
e) T(n) = 5T(n/2) + n^3
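For recurrences whose driving function is a plain polynomial f(n) = n^k, the case analysis is mechanical: compare k with log_b a. A minimal sketch that classifies examples c), d), and e) this way (the helper name and output strings are ours; it handles polynomial f(n) only):

```python
from math import log

def master_poly(a, b, k):
    # Classify T(n) = a*T(n/b) + n^k by the master method,
    # for polynomial driving functions only.
    crit = log(a) / log(b)   # the critical exponent log_b(a)
    eps = 1e-9               # tolerance for float comparison
    if abs(k - crit) < eps:
        return f"Theta(n^{k} lg n)"    # case 2: k = log_b(a)
    if k < crit:
        return f"Theta(n^{crit:.2f})"  # case 1: f polynomially smaller
    # case 3: f polynomially larger; regularity holds since
    # a*f(n/b) = (a/b^k)*n^k and a/b^k < 1.
    return f"Theta(n^{k})"

print(master_poly(16, 4, 2))  # c) log_4(16) = 2      -> Theta(n^2 lg n)
print(master_poly(5, 2, 2))   # d) log_2(5) ~ 2.32 > 2 -> case 1
print(master_poly(5, 2, 3))   # e) log_2(5) ~ 2.32 < 3 -> Theta(n^3)
```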