
Recurrence Relations
 An equation or inequality that characterizes a
function in terms of its values on smaller inputs.
 Solution Methods
 Substitution Method.
 Recursion-tree Method.
 Master Method.
 Iteration Method
 Recurrence relations arise when we analyze the
running time of iterative or recursive algorithms.

 Ex: Divide and Conquer.


T(n) = Θ(1) if n ≤ c
T(n) = a T(n/b) + D(n) + C(n) otherwise
Substitution Method
 Guess the form of the solution, then
use mathematical induction to show it is correct.
 Substitute guessed answer for the function when the
inductive hypothesis is applied to smaller values –
hence, the name.
 Works well when the solution is easy to guess.
 No general way to guess the correct solution.
Example – Exact Function
Recurrence: T(n) = 1 if n = 1
T(n) = 2T(n/2) + n if n > 1
Guess: T(n) = n lg n + n.
Induction:
•Basis: n = 1 ⇒ n lg n + n = 1·lg 1 + 1 = 1 = T(1).
•Hypothesis: T(k) = k lg k + k for all k < n.
•Inductive Step: T(n) = 2 T(n/2) + n
= 2 ((n/2)lg(n/2) + (n/2)) + n
= n (lg(n/2)) + 2n
= n lg n – n + 2n
= n lg n + n
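The induction above can be cross-checked numerically. A minimal sketch (Python; assumes n is a power of 2 and takes the base case to be T(1) = 1) evaluates the recurrence directly and compares it with the closed form n lg n + n:

```python
def T(n):
    # T(1) = 1; T(n) = 2*T(n/2) + n, for n a power of 2
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# Closed form claimed by the induction: n*lg n + n.
# Since lg(2^k) = k, the closed form at n = 2^k is n*k + n.
for k in range(11):
    n = 2 ** k
    assert T(n) == n * k + n
```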
Example – With Asymptotics
To Solve: T(n) = 3T(n/3) + n
 Guess: T(n) = O(n lg n)
 Need to prove: T(n) ≤ cn lg n, for some c > 0.
 Hypothesis: T(k) ≤ ck lg k, for all k < n.
 Calculate:
T(n) ≤ 3c n/3 lg n/3 + n
≤ c n lg (n/3) + n
= c n lg n – c n lg3 + n
= c n lg n – n (c lg 3 – 1)
≤ c n lg n
(The last step is true for c ≥ 1 / lg3.)
Example – With Asymptotics
To Solve: T(n) = 3T(n/3) + n
 To show T(n) = Θ(n lg n), must show both upper and lower
bounds, i.e., T(n) = O(n lg n) AND T(n) = Ω(n lg n)
 Show: T(n) = Ω(n lg n), i.e., T(n) ≥ cn lg n for some c > 0.
 Hypothesis: T(k) ≥ ck lg k, for all k < n.
 Calculate:
T(n) ≥ 3(c(n/3) lg(n/3)) + n
= cn lg(n/3) + n
= cn lg n – cn lg 3 + n
= cn lg n – n(c lg 3 – 1)
≥ cn lg n
(The last step holds for c ≤ 1/lg 3.)
Example – With Asymptotics
If T(n) = 3T(n/3) + O (n), as opposed to T(n) = 3T(n/3) + n,
then rewrite T(n) ≤ 3T(n/3) + cn, c > 0.
 To show T(n) = O(n lg n), use second constant d, different from c.
 Calculate:
T(n) ≤ 3d n/3 lg n/3 +c n
≤ d n lg (n/3) + cn
= d n lg n – d n lg3 + cn
= d n lg n – n (d lg 3 – c)
≤ d n lg n
(The last step is true for d ≥ c / lg3.)
It is OK for d to depend on c.
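The O(n lg n) bound for T(n) = 3T(n/3) + n can also be probed numerically. The sketch below (Python; assumes n is a power of 3 and a base case T(1) = 1, both illustrative choices) checks T(n) ≤ cn lg n with c = 1, which satisfies c ≥ 1/lg 3 ≈ 0.63:

```python
import math

def T(n):
    # T(1) = 1; T(n) = 3*T(n/3) + n, for n a power of 3
    if n == 1:
        return 1
    return 3 * T(n // 3) + n

# With c = 1 >= 1/lg 3, the bound T(n) <= c*n*lg n holds once
# n is large enough (here, from n = 9 onward).
for k in range(2, 9):
    n = 3 ** k
    assert T(n) <= n * math.log2(n)
```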
Making a Good Guess
 If a recurrence is similar to one seen before, then guess a
similar solution.
 T(n) = 3T(n/3 + 5) + n (Similar to T(n) = 3T(n/3) + n)
• When n is large, the difference between n/3 and (n/3 + 5) is
insignificant.
• Hence, can guess O(n lg n).
 Method 2: Prove loose upper and lower bounds on the
recurrence and then reduce the range of uncertainty.
 E.g., start with T(n) = Ω(n) & T(n) = O(n2).
 Then lower the upper bound and raise the lower bound.
Recursion-tree Method
 Making a good guess is sometimes difficult
with the substitution method.
 Use recursion trees to devise good guesses.
 Recursion Trees
 Show successive expansions of recurrences using
trees.
 Keep track of the time spent on the subproblems of
a divide and conquer algorithm.
 Help organize the algebraic bookkeeping necessary
to solve a recurrence.
Recursion Tree – Example
 Running time of Merge Sort:
T(n) = Θ(1) if n = 1
T(n) = 2T(n/2) + Θ(n) if n > 1
 Rewrite the recurrence as
T(n) = c if n = 1
T(n) = 2T(n/2) + cn if n > 1
c > 0: Running time for the base case and
time per array element for the divide and
combine steps.
Recursion Tree for Merge Sort
For the original problem, we have a cost of cn, plus two subproblems, each of size n/2 and running time T(n/2). Each of the size-n/2 problems has a cost of cn/2 plus two subproblems, each costing T(n/4).

                cn              ← cost of divide and merge
              /    \
          cn/2      cn/2
          /  \      /  \
     T(n/4) T(n/4) T(n/4) T(n/4)   ← cost of sorting subproblems
Recursion Tree for Merge Sort
Continue expanding until the problem size reduces to 1.
Level 0:  cn                         → total cn
Level 1:  cn/2  cn/2                 → total cn
Level 2:  cn/4  cn/4  cn/4  cn/4     → total cn
…         (lg n + 1 levels in all)
Leaves:   c  c  c  …  c  (n of them) → total cn
Total: cn lg n + cn
Recursion Tree for Merge Sort
Continue expanding until the problem size reduces to 1.
•Each level has total cost cn.
•Each time we go down one level, the number of subproblems doubles, but the cost per subproblem halves ⇒ the cost per level remains the same.
•There are lg n + 1 levels, and the height is lg n (assuming n is a power of 2); this can be proved by induction.
•Total cost = sum of costs at each level = (lg n + 1)cn = cn lg n + cn = Θ(n lg n).
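The level-by-level bookkeeping can be reproduced in code. This sketch (Python; assumes n is a power of 2 and a unit constant c = 1 by default) lists the cost of each level of the tree for T(n) = 2T(n/2) + cn:

```python
import math

def level_costs(n, c=1):
    # Level i of the tree for T(n) = 2T(n/2) + cn has 2**i nodes,
    # each charged c * (n / 2**i); the final level is the n leaves,
    # each charged c.
    levels = int(math.log2(n))  # height of the tree: lg n
    costs = [(2 ** i) * c * (n // 2 ** i) for i in range(levels)]
    costs.append(c * n)         # leaf level
    return costs

costs = level_costs(16)
assert all(cost == 16 for cost in costs)  # each of the lg n + 1 levels costs cn
assert sum(costs) == 16 * 4 + 16          # total = cn lg n + cn
```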
Practice Examples
 Use the recursion-tree method to determine a
guess for the recurrences
 T(n) = 2T(n/2) + 1
 T(n) = 2T(n/2) + n2
 T(n) = 3T(n/4) + Θ(n2).
 T(n) = T(n/3) + T(2n/3) + O(n).
 T(n) = T(n/10) + T(9n/10) + Θ(n).
 T(n) = T(n – 1) + T(n – 2) + Θ(1).
The Master Method
 Based on the Master theorem.
 “Cookbook” approach for solving recurrences
of the form
T(n) = aT(n/b) + f(n)
• a ≥ 1, b > 1 are constants.
• f(n) is asymptotically positive.
• n/b may not be an integer, but we ignore floors and
ceilings.
 Requires memorization of three cases.
The Master Theorem
Theorem:
Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and
let T(n) be defined on the nonnegative integers by the recurrence
T(n) = aT(n/b) + f(n), where n/b can be taken to mean ⌊n/b⌋ or ⌈n/b⌉.
Then T(n) can be bounded asymptotically in three cases:
1. If f(n) = O(n^(log_b a – ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0,
and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n,
then T(n) = Θ(f(n)).
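For recurrences whose driving function has the common shape f(n) = n^k (lg n)^i, the case analysis can be mechanized. The sketch below is illustrative, not a standard library API; the hypothetical helper `master` compares k with the critical exponent log_b a and returns None when none of the three cases applies:

```python
import math

def master(a, b, k, i=0):
    # Classify T(n) = a*T(n/b) + f(n), with f(n) = n**k * (lg n)**i,
    # by comparing k against the critical exponent e = log_b a.
    e = math.log(a, b)
    if math.isclose(k, e):
        if i == 0:
            return f"Theta(n^{e:g} lg n)"  # Case 2
        return None  # f is n^e times a lg factor: none of the 3 cases applies
    if k < e:
        return f"Theta(n^{e:g})"           # Case 1: leaves dominate
    # Case 3: k > e. Regularity holds automatically here, since
    # a*f(n/b) <= (a/b**k)*f(n) and a/b**k < 1 when k > log_b a.
    lg = "" if i == 0 else (" lg n" if i == 1 else f" lg^{i} n")
    return f"Theta(n^{k:g}{lg})"
```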
Recursion tree view
The root costs f(n); its a children each cost f(n/b), for a level total of a·f(n/b); the a² grandchildren contribute a²·f(n/b²); and so on, down to Θ(1) leaves whose combined cost is Θ(n^(log_b a)).

Total: T(n) = Θ(n^(log_b a)) + Σ_{j=0}^{log_b n – 1} a^j · f(n/b^j)
Master Method – Examples
 T(n) = 16T(n/4) + n
 a = 16, b = 4, n^(log_b a) = n^(log_4 16) = n².
 f(n) = n = O(n^(log_b a – ε)) = O(n^(2–ε)), where ε = 1 ⇒ Case 1.
 Hence, T(n) = Θ(n^(log_b a)) = Θ(n²).

 T(n) = T(3n/7) + 1
 a = 1, b = 7/3, and n^(log_b a) = n^(log_{7/3} 1) = n⁰ = 1.
 f(n) = 1 = Θ(n^(log_b a)) ⇒ Case 2.
 Therefore, T(n) = Θ(n^(log_b a) lg n) = Θ(lg n).
Master Method – Examples
 T(n) = 3T(n/4) + n lg n
 a = 3, b = 4, thus n^(log_b a) = n^(log_4 3) = O(n^0.793).
 f(n) = n lg n = Ω(n^(log_4 3 + ε)), where ε ≈ 0.2, and the regularity condition holds: a·f(n/b) = 3(n/4) lg(n/4) ≤ (3/4)·n lg n = c·f(n) for c = 3/4 ⇒ Case 3.
 Therefore, T(n) = Θ(f(n)) = Θ(n lg n).

 T(n) = 2T(n/2) + n lg n
 a = 2, b = 2, f(n) = n lg n, and n^(log_b a) = n^(log_2 2) = n.
 f(n) is asymptotically larger than n^(log_b a), but not
polynomially larger: the ratio f(n)/n^(log_b a) = lg n is asymptotically
smaller than n^ε for every constant ε > 0. Thus, the Master Theorem
does not apply here.
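Even though the theorem does not apply, this recurrence can still be summed level by level. A quick numeric sketch (Python; assumes n is a power of 2 and a zero-cost base case, both illustrative choices) is consistent with the known answer T(n) = Θ(n lg² n):

```python
import math

def T(n):
    # Sum the tree for T(n) = 2T(n/2) + n lg n level by level:
    # level j has 2**j nodes of size n/2**j, contributing
    # 2**j * (n/2**j) * lg(n/2**j) = n * (lg n - j) in total.
    total, size, count = 0.0, n, 1
    while size > 1:
        total += count * size * math.log2(size)
        size //= 2
        count *= 2
    return total

# T(2^k) = 2^k * k*(k+1)/2, so T(n)/(n * lg^2 n) tends to 1/2,
# i.e. T(n) grows like n lg^2 n, not n lg n.
for k in (8, 12, 16):
    n = 2 ** k
    assert 0.4 < T(n) / (n * k * k) < 0.7
```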
Master Theorem – What it means?
 Case 1: If f(n) = O(n^(log_b a – ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
 n^(log_b a) = a^(log_b n): the number of leaves in the recursion tree.
 f(n) = O(n^(log_b a – ε)) ⇒ the total cost of the nodes at each
internal level is asymptotically smaller than the cost of the leaves
by a polynomial factor.
 The cost of the problem is dominated by the leaves; hence the cost is
Θ(n^(log_b a)).
Master Theorem – What it means?
 Case 2: If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
 n^(log_b a) = a^(log_b n): the number of leaves in the recursion tree.
 f(n) = Θ(n^(log_b a)) ⇒ the total cost of the nodes at each
level is asymptotically the same as the cost of the leaves.
 There are Θ(lg n) levels.
 Hence, the total cost is Θ(n^(log_b a) lg n).
Master Theorem – What it means?
 Case 3: If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0,
and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently
large n, then T(n) = Θ(f(n)).
 n^(log_b a) = a^(log_b n): the number of leaves in the recursion tree.
 f(n) = Ω(n^(log_b a + ε)) ⇒ the cost is dominated by the root: the
cost of the root is asymptotically larger than the total cost of
the leaves by a polynomial factor.
 Hence, the cost is Θ(f(n)).
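The leaf-count identity n^(log_b a) = a^(log_b n) used in all three cases can be sanity-checked numerically; in this sketch a = 3 and b = 2 are arbitrary illustrative choices:

```python
import math

def leaves(n, a, b):
    # Number of leaves in the recursion tree of T(n) = a*T(n/b) + f(n):
    # each node has a children until the subproblem size drops to 1.
    if n <= 1:
        return 1
    return a * leaves(n // b, a, b)

a, b = 3, 2
n = 2 ** 10
assert leaves(n, a, b) == a ** 10                  # a^(log_b n) leaves
assert math.isclose(a ** 10, n ** math.log(a, b))  # equals n^(log_b a)
```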
Practice Examples
 Use the Master method to solve the following
recurrences. If a recurrence cannot be solved by the
Master method, state the reason.
 T(n) = 4T(n/2) + n
 T(n) = 4T(n/2) + n2
 T(n) = 16T(n/2) + n2
 T(n) = 7T(n/2) + n2
 T(n) = 2T(n/4) + Θ(1)
 T(n) = 2T(n/4) + 𝑛𝑛
 T(n) = 2T(n/4) + Θ(n)
 T(n) = 2T(n/4) + n2
Practice Examples
 Use the Master method to solve the following
recurrences. If a recurrence cannot be solved by the
Master method, state the reason.
 T(n) = T(3n/7) + Θ(1)
 T(n) = 3T(n/4) + 𝑛𝑛 log 𝑛𝑛
 T(n) = T(n/2) + Θ(1)
 T(n) = 3T(n/2) + Θ(n)
 T(n) = 4T(n/2) + n2 log 𝑛𝑛
Iteration Method
 A "brute force" method of solving a recurrence
relation.
 It is quite similar to the recursion-tree method
(the resulting summation is the same).
 Usually considered for recurrences that reduce the
problem size by subtraction.
 The general idea is to iteratively substitute the value
of the recurrent part of the equation until a pattern
(usually a summation) is noticed which can be used
to evaluate the recurrence.
Example: Iteration Method

Given the recurrence T(n) = T(n–1) + 1 with T(1) = Θ(1):

T(n) = T(n–1) + 1
= (T(n–2) + 1) + 1 = T(n–2) + 2
= (T(n–3) + 1) + 2 = T(n–3) + 3
= …
= T(n–k) + k
When k = n – 1:
T(n–k) = T(1) = Θ(1)
T(n) = Θ(1) + (n – 1)
= 1 + n – 1
= n = Θ(n).
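The unrolling above can be mirrored in code. This sketch (Python; treats the Θ(1) base case as the value 1, an illustrative choice) performs one substitution step per loop iteration:

```python
def T_iter(n):
    # Unrolling T(n) = T(n-1) + 1: after k steps, T(n) = T(n-k) + k.
    # Stopping at k = n - 1 gives T(n) = T(1) + (n - 1).
    t = 1                   # base case T(1)
    for _ in range(n - 1):  # n - 1 substitution steps
        t += 1
    return t

assert T_iter(1) == 1
assert T_iter(100) == 100   # T(n) = 1 + (n - 1) = n, i.e. Theta(n)
```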
Changing the variable

Practice Examples: T(n) = 2T(√n) + 1

T(n) = 2T(√n) + √n