Analyzing Recursive Algorithms

Recursive algorithms can often be described by recurrence relations that define the runtime T(n) in terms of the runtime on smaller inputs. There are three main methods for solving recurrences: backward substitution, making an educated guess and proving it with induction, and applying the Master Theorem. Backward substitution involves expanding the recurrence by substituting prior terms, revealing a pattern that can be written as a summation. Making a guess and proving it involves assuming a form for T(n) and showing it holds for all n. The Master Theorem provides a formula for certain divide-and-conquer recurrences.


Analyzing Recursive Algorithms

A recursive algorithm can often be described by a recurrence equation that describes the overall runtime on a problem of size n in terms of the runtime on smaller inputs.

For divide-and-conquer algorithms, we get recurrences like:

    T(n) = Θ(1)                      if n ≤ c
    T(n) = aT(n/b) + D(n) + C(n)     otherwise

where
• a = number of subproblems we divide the problem into
• n/b = size of the subproblems (in terms of n)
• D(n) = time to divide the size-n problem into subproblems
• C(n) = time to combine the subproblem solutions to get the answer for the problem of size n

For recursive algorithms such as computing the factorial of n, we get an expression like the following:

    T(n) = Θ(1)                   if n = 0
    T(n) = T(n-1) + D(n) + C(n)   otherwise

where
• n-1 = size of the subproblem (in terms of n)
• D(n) = time to divide the size-n problem into subproblems (O(1))
• C(n) = time to combine the subproblem solutions to get the answer for the problem of size n (O(1))

Solving Recurrences (Ch. 4)

We will use the following 3 methods to solve recurrences:

1. Backward substitution: involves seeing a pattern and converting it to a summation.
2. Make a guess and prove it using induction.
3. Apply the "Master Theorem": if the recurrence has the form

       T(n) = aT(n/b) + f(n)

   then there is a formula that can (often) be applied, given in Section 4-3.

To make the solutions simpler, we will assume base cases are constant, i.e., T(n) = Θ(1) for n small enough.

Solving the recurrence for n!

Algorithm F(n)
Input: a positive integer n
Output: n!
1. if n = 0
2.   then return 1
3. else
4.   return F(n-1) * n

The number of multiplications satisfies

    T(n) = T(n-1) + 1
    T(0) = 0

We can solve this recurrence (i.e., find an expression for the running time that is not given in terms of itself) using a method known as backward substitution:

    T(n) = T(n-1) + 1                      /* substitute T(n-1) = T(n-2) + 1 */
         = [T(n-2) + 1] + 1 = T(n-2) + 2   /* substitute T(n-2) = T(n-3) + 1 */
         = [T(n-3) + 1] + 2 = T(n-3) + 3
         = ...
         = T(n-i) + i = ... = T(n-n) + n = 0 + n

Therefore, this algorithm has linear running time.
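The algorithm and its multiplication count can be sketched in Python; this is an illustrative translation of the pseudocode above, with `mult_count` mirroring the recurrence T(n) = T(n-1) + 1, T(0) = 0:

```python
def factorial(n):
    """Recursive factorial, following Algorithm F(n) above."""
    if n == 0:
        return 1
    return factorial(n - 1) * n

def mult_count(n):
    """Number of multiplications: T(n) = T(n-1) + 1, T(0) = 0."""
    return 0 if n == 0 else mult_count(n - 1) + 1

assert factorial(5) == 120
assert mult_count(10) == 10   # T(n) = n: exactly one multiplication per level
```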

Solving Recurrences: Backward Substitution

Example: T(n) = 2T(n/2) + n (look familiar?)

    T(n) = 2T(n/2) + n
         = 2[2T(n/4) + n/2] + n              /* expand T(n/2) */
         = 4T(n/4) + n + n                   /* simplify: 4 = 2^2 */
         = 4[2T(n/8) + n/4] + n + n          /* expand T(n/4) */
         = 8T(n/8) + n + n + n               /* simplify... notice a pattern? */
         = ...
         = 2^(lgn) T(n/2^(lgn)) + n + ... + n   /* after lgn iterations */
         = c·2^(lgn) + n·lgn
         = cn + n·lgn                        /* 2^(lgn) = n */
         = Θ(n lgn)

Example: T(n) = 4T(n/2) + n

    T(n) = 4T(n/2) + n
         = 4[4T(n/4) + n/2] + n              /* expand T(n/2) */
         = 16T(n/4) + 2n + n                 /* simplify: 16 = 4^2 */
         = 16[4T(n/8) + n/4] + 2n + n        /* expand T(n/4) */
         = 64T(n/8) + 4n + 2n + n            /* simplify: 64 = 4^3 */
         = ...
         = 4^(lgn) T(n/2^(lgn)) + ... + 4n + 2n + n   /* after lgn iterations */
         = c·4^(lgn) + n(2^0 + 2^1 + ... + 2^(lgn-1))   /* convert to a summation */
         = c·4^(lgn) + n(2^(lgn) - 1)
         = cn^2 + n(n - 1)                   /* 4^(lgn) = n^(lg4) = n^2; 2^(lgn) = n */
         = Θ(n^2)
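The two closed forms derived above can be spot-checked numerically for powers of two. This sketch (an illustration, not from the slides) evaluates each recurrence directly with base case T(1) = c = 1 and compares against the closed forms:

```python
from math import log2

def T1(n, c=1):
    """T(n) = 2T(n/2) + n, T(1) = c, for n a power of two."""
    return c if n == 1 else 2 * T1(n // 2, c) + n

def T2(n, c=1):
    """T(n) = 4T(n/2) + n, T(1) = c, for n a power of two."""
    return c if n == 1 else 4 * T2(n // 2, c) + n

for k in range(1, 11):
    n = 2 ** k
    assert T1(n) == n * log2(n) + n       # cn + n·lgn with c = 1
    assert T2(n) == n * n + n * (n - 1)   # cn^2 + n(n-1) with c = 1
```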
Solving Recurrences: Backward Substitution

Example: T(n) = T(n/2) + 1

    T(n) = T(n/2) + 1
         = [T(n/4) + 1] + 1      /* expand T(n/2) */
         = T(n/4) + 2            /* simplify */
         = [T(n/8) + 1] + 2      /* expand T(n/4) */
         = T(n/8) + 3            /* simplify */
         = ...
         = T(n/2^(lgn)) + lgn    /* 2^(lgn) = n^(lg2) = n */
         = c + lgn
         = Θ(lgn)

?? Which algorithm has this recurrence ??

Binary Search (non-recursive)

Algorithm Binary-Search(A[1…n], k)
Input: a sorted array A of n comparable items and search key k
Output: Index of array's element that is equal to k, or -1 if k is not found
1. l = 1; r = n
2. while l <= r do
3.   m = floor of (l + r)/2
4.   if k = A[m] return m
5.   else if k < A[m] r = m – 1
6.   else l = m + 1
7. return -1

What is the running time of this algorithm for an input of size n? Are there best- and worst-case input instances?
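The pseudocode translates directly to Python; this sketch uses 0-based indexing (the pseudocode above is 1-based):

```python
def binary_search(A, k):
    """Iterative binary search on a sorted list A; returns index of k or -1."""
    l, r = 0, len(A) - 1
    while l <= r:
        m = (l + r) // 2          # floor of the midpoint
        if A[m] == k:
            return m
        elif k < A[m]:
            r = m - 1             # search the left half
        else:
            l = m + 1             # search the right half
    return -1

assert binary_search([1, 3, 5, 7, 9], 7) == 3
assert binary_search([1, 3, 5, 7, 9], 4) == -1
```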

Binary Search (recursive)

Algorithm Binary-Search-Rec(A[1…n], k, l, r)
Input: a sorted array A of n comparable items, search key k, leftmost and rightmost index positions in A
Output: Index of array's element that is equal to k, or -1 if k is not found
1. if (l > r) return -1
2. else
3.   m = floor of (l + r)/2
4.   if k = A[m] return m
5.   else if k < A[m] return Binary-Search-Rec(A, k, l, m-1)
6.   else return Binary-Search-Rec(A, k, m+1, r)

What is the running time of this algorithm for an input of size n?

Guessing an Upper Bound

Give an upper bound on the recurrence: T(n) = 2T(⌊n/2⌋) + n.
Guess T(n) = O(n lgn) by showing that T(n) ≤ cn lgn for some c > 0.

Assume T(⌊n/2⌋) ≤ c⌊n/2⌋ lg(⌊n/2⌋).

    T(n) ≤ 2(c⌊n/2⌋ lg(⌊n/2⌋)) + n
         ≤ cn lg(n/2) + n
         = cn lgn - cn lg2 + n
         = cn lgn - cn + n
         ≤ cn lgn                  for c ≥ 1.
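The recursive version in Python (again with 0-based indices); each call halves the range, which is exactly the T(n) = T(n/2) + 1 recurrence solved earlier:

```python
def binary_search_rec(A, k, l, r):
    """Recursive binary search on sorted A[l..r]; returns index of k or -1."""
    if l > r:
        return -1
    m = (l + r) // 2              # floor of the midpoint
    if A[m] == k:
        return m
    elif k < A[m]:
        return binary_search_rec(A, k, l, m - 1)
    else:
        return binary_search_rec(A, k, m + 1, r)

assert binary_search_rec([2, 4, 6, 8], 6, 0, 3) == 2
assert binary_search_rec([2, 4, 6, 8], 5, 0, 3) == -1
```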

Mathematical induction with a good guess

Suppose T(n) = 1 if n = 2, and T(n) = T(n/2) + 1 if n = 2^k, for k > 1.
Show T(n) = lgn by induction on the exponent k.
Basis: When k = 1, n = 2. T(2) = lg2 = 1.
IHOP: Assume T(2^k) = lg2^k = k for some k ≥ 1.
Inductive step: Show T(2^(k+1)) = lg(2^(k+1)) = k+1.

    T(2^(k+1)) = T(2^k) + 1     /* given */
               = (lg2^k) + 1    /* by inductive hypothesis */
               = k + 1          /* lg2^k = k */

Suppose T(n) = 1 if n = 1, and T(n) = T(n-1) + n for n > 1.
Show T(n) = O(n^2) by induction.
Basis: When n = 1, T(1) = 1 ≤ 1^2.
IHOP: Assume T(m) ≤ m^2 for all m < k.
Inductive step: Show T(k) ≤ k^2.

    T(k) = T(k-1) + k      /* given */
         ≤ (k-1)^2 + k     /* by inductive hypothesis */
         = k^2 - k + 1
         ≤ k^2             for k ≥ 1
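The second claim can be spot-checked numerically: T(n) = T(n-1) + n with T(1) = 1 has the exact closed form n(n+1)/2, which indeed stays within the n^2 bound proved above. A quick sketch:

```python
def T(n):
    """T(1) = 1, T(n) = T(n-1) + n; closed form is n(n+1)/2."""
    return 1 if n == 1 else T(n - 1) + n

for n in range(1, 200):
    assert T(n) == n * (n + 1) // 2   # exact closed form
    assert T(n) <= n * n              # the O(n^2) bound from the induction
```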
Solving Recurrences: Master Method (§4.3)

The master method provides a 'cookbook' method for solving recurrences of a certain form.

Master Theorem: Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers as:

    T(n) = aT(n/b) + f(n)

Recall, a is the number of subproblems and n/b is the size of each subproblem.

Then, T(n) can be bounded asymptotically as follows:

1. T(n) = Θ(n^(log_b a))      if f(n) = O(n^(log_b a - ε)) for some constant ε > 0
2. T(n) = Θ(n^(log_b a) lgn)  if f(n) = Θ(n^(log_b a))
3. T(n) = Θ(f(n))             if f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0

Intuition: Compare f(n) with Θ(n^(log_b a)).

case 1: f(n) is "polynomially smaller than" Θ(n^(log_b a))
case 2: f(n) is "asymptotically equal to" Θ(n^(log_b a))
case 3: f(n) is "polynomially larger than" Θ(n^(log_b a))

What is log_b a? The number of times we divide a by b to reach O(1).

Master Method Restated

Master Theorem: If T(n) = aT(n/b) + O(n^d) for some constants a ≥ 1, b > 1, d ≥ 0, then

    T(n) = Θ(n^(log_b a))   if d < log_b a   (a > b^d)
    T(n) = Θ(n^d lgn)       if d = log_b a   (a = b^d)
    T(n) = Θ(n^d)           if d > log_b a   (a < b^d)

Why? The proof uses a recursion tree argument.

Review of Logarithms

A logarithm is an inverse exponential function. Saying b^x = y is equivalent to saying log_b y = x.

Properties of logarithms:
• log_b(xy) = log_b x + log_b y
• log_b(x/y) = log_b x - log_b y
• log_b x^a = a·log_b x
• log_b a = log_x a / log_x b
• a = b^(log_b a)   (e.g., n = 2^(lgn) = n^(lg2))
• lg^k n = (lgn)^k
• lglg(n) = lg(lgn)
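The restated (O(n^d)) form of the theorem is mechanical enough to encode directly. The sketch below is a hypothetical helper, not from the slides; it compares a with b^d to pick the case and returns the bound as a string:

```python
from math import log

def master(a, b, d):
    """Classify T(n) = aT(n/b) + O(n^d) using the restated master theorem.
    Hypothetical helper for checking answers by hand."""
    bd = b ** d
    if a > bd:                                  # case 1: d < log_b a
        return f"Theta(n^{log(a, b):.3g})"
    if a == bd:                                 # case 2: d = log_b a
        return f"Theta(n^{d:g} lg n)"
    return f"Theta(n^{d:g})"                    # case 3: d > log_b a

# Merge sort: T(n) = 2T(n/2) + O(n) is case 2, Theta(n lg n).
assert master(2, 2, 1) == "Theta(n^1 lg n)"
```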

Recursion Tree for Merge-Sort

Recurrence for the worst-case running time of Merge-Sort:

    T(n) = c               if n = 1
    T(n) = 2T(n/2) + cn    otherwise

[Recursion tree: cn of work at the root; two children costing cn/2 each; four grandchildren costing cn/4 each; down to n leaves costing c each. There are lgn + 1 levels (height h = lgn), each level sums to cn, for a total of cnlgn + cn.]

Solving Recurrences: Master Method (§4.3)

Master Theorem: Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers as:

    T(n) = aT(n/b) + f(n)

Then, T(n) can be bounded asymptotically as follows:

1. T(n) = Θ(n^(log_b a))      if f(n) = O(n^(log_b a - ε)) for some constant ε > 0
2. T(n) = Θ(n^(log_b a) lgn)  if f(n) = Θ(n^(log_b a))
3. T(n) = Θ(f(n))             if f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0,
                              and if a·f(n/b) ≤ c·f(n) for some positive constant c < 1
                              and all sufficiently large n

Case 3 requires us to also show a·f(n/b) ≤ c·f(n), the "regularity" condition.

The regularity condition always holds whenever f(n) = n^k and f(n) = Ω(n^(log_b a + ε)), so we don't need to check it when f(n) is a polynomial.
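A Merge-Sort sketch matching the recurrence above: a = 2 subproblems of size n/2, and a merge step C(n) = O(n), so T(n) = 2T(n/2) + cn.

```python
def merge_sort(A):
    """Merge sort: split into two halves, sort each, merge in linear time."""
    if len(A) <= 1:
        return A
    mid = len(A) // 2
    left = merge_sort(A[:mid])
    right = merge_sort(A[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # the O(n) combine step C(n)
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

assert merge_sort([5, 2, 9, 1, 5]) == [1, 2, 5, 5, 9]
```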
Solving Recurrences: Master Method (§4.3)

These 3 cases do not cover all the possibilities for f(n).

There is a gap between cases 1 and 2 when f(n) is smaller than n^(log_b a), but not polynomially smaller.

There is a gap between cases 2 and 3 when f(n) is larger than n^(log_b a), but not polynomially larger.

If the function falls into one of these 2 gaps, or if the regularity condition can't be shown to hold, then the master method can't be used to solve the recurrence.

A more general version of Case 2 follows:

    T(n) = Θ(n^(log_b a) lg^(k+1) n)   if f(n) = Θ(n^(log_b a) lg^k n) for k ≥ 0

This case covers the gap between cases 2 and 3 in which f(n) is larger than n^(log_b a) by only a polylog factor. We'll see an example of this type of recurrence in class.

Solving Recurrences: Master Method

Example: T(n) = 9T(n/3) + n
• a = 9, b = 3, f(n) = n, n^(log_b a) = n^(log_3 9) = n^2
• compare f(n) = n with n^2:
  n = O(n^(2-ε))   (so f(n) is polynomially smaller than n^(log_b a))
• case 1 applies: T(n) = Θ(n^2)

Example: T(n) = T(n/2) + 1
• a = 1, b = 2, f(n) = 1, n^(log_b a) = n^(log_2 1) = n^0 = 1
• compare f(n) = 1 with 1:
  1 = Θ(n^0)   (so f(n) is asymptotically equal to n^(log_b a))
• case 2 applies: T(n) = Θ(n^0 lgn) = Θ(lgn)

Example: T(n) = T(n/2) + n^2
• a = 1, b = 2, f(n) = n^2, n^(log_b a) = n^(log_2 1) = n^0 = 1
• compare f(n) = n^2 with 1:
  n^2 = Ω(n^(0+ε))   (so f(n) is polynomially larger)
• since f(n) is a polynomial in n, case 3 holds and T(n) = Θ(n^2)

Example: T(n) = 4T(n/2) + n^2
• a = 4, b = 2, f(n) = n^2, n^(log_b a) = n^(log_2 4) = n^2
• compare f(n) = n^2 with n^2:
  n^2 = Θ(n^2)   (so f(n) is asymptotically equal)
• case 2 holds: T(n) = Θ(n^2 lgn)
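These answers can be sanity-checked numerically: if T(n) = Θ(n^2), the ratio T(n)/n^2 should approach a constant. The sketch below (a hypothetical helper, not from the slides) evaluates a recurrence directly and checks the first example, T(n) = 9T(n/3) + n, where the ratio approaches 1.5 with base case T(1) = 1:

```python
def solve(a, b, f, n, base=1):
    """Evaluate T(n) = a*T(n/b) + f(n) numerically, for n a power of b."""
    return base if n <= 1 else a * solve(a, b, f, n // b, base) + f(n)

# T(n) = 9T(n/3) + n is Theta(n^2): T(n)/n^2 levels off at a constant.
n = 3 ** 8
ratio = solve(9, 3, lambda m: m, n) / n ** 2
assert abs(ratio - 1.5) < 0.001
```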

Solving Recurrences: Master Method

Example: T(n) = 7T(n/3) + n^2
• a = 7, b = 3, f(n) = n^2, n^(log_b a) = n^(log_3 7) = n^(1+ε')   (log_3 7 ≈ 1.77)
• compare f(n) = n^2 with n^(1+ε'):
  n^2 = Ω(n^(log_3 7 + ε))   (so f(n) is polynomially larger)
• since f(n) is a polynomial in n, case 3 holds and T(n) = Θ(n^2)

Example: T(n) = 7T(n/2) + n^2
• a = 7, b = 2, f(n) = n^2, n^(log_b a) = n^(log_2 7) = n^(2+ε')   (log_2 7 ≈ 2.81)
• compare f(n) = n^2 with n^(2+ε'):
  n^2 = O(n^(log_2 7 - ε))   (so f(n) is polynomially smaller)
• case 1 holds: T(n) = Θ(n^(log_2 7))

Mini-Homework 4
(to be worked in class 2/5/09)

Use the master method to give tight bounds for the running time of T(n) in each of the following recurrences. Specify which case of the master theorem holds for each problem.

a) T(n) = 2T(n/2) + n^3
b) T(n) = T(2n/3) + 1
c) T(n) = 16T(n/4) + n^2
d) T(n) = 5T(n/2) + n^2
e) T(n) = 5T(n/2) + n^3
