On Recurrence Relation

The document discusses recurrence relations including definition, formation, methods to solve, and examples. It defines recurrence relations as equations that define a sequence based on a rule giving the next term as a function of previous terms. Common methods to solve recurrences discussed are substitution method, recursion tree method, and Master's method.
Recurrence Relation

What is a recurrence relation?


To be discussed
• Definition
• Formation of recurrence relation
• Methods to solve recurrence relation
• Master theorem for dividing function

~ Rezaur Rahman
S.No.: 47
Class: IT-2, 5th Sem
Recurrences and Examples
• An equation that defines a sequence based on a rule that
gives the next term as a function of the previous term(s), e.g.
T(n) = T(n-1) + n
• Recurrences arise when an algorithm contains recursive
calls to itself
• Example: the Fibonacci series, where each term depends
on the two preceding terms:

f(n) = f(n-1) + f(n-2)

with the initial values f(0) = 0 and f(1) = 1


2
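The Fibonacci recurrence above can be computed directly; a minimal sketch in Python, where `fib` and the use of memoization are illustrative additions, not from the slides:

```python
# Hypothetical sketch of the Fibonacci recurrence f(n) = f(n-1) + f(n-2)
# with initial values f(0) = 0 and f(1) = 1.
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n < 2:                      # initial values: f(0) = 0, f(1) = 1
        return n
    return fib(n - 1) + fib(n - 2)

print([fib(i) for i in range(8)])  # [0, 1, 1, 2, 3, 5, 8, 13]
```

Without memoization the same recurrence takes exponential time, which is why analyzing recurrences matters.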
Example Recurrences
• T(n) = T(n-1) + n → Θ(n^2)
– Recursive algorithm that loops through the input to
eliminate one item
• T(n) = T(n/2) + c → Θ(lg n)
– Recursive algorithm that halves the input in one step
• T(n) = T(n/2) + n → Θ(n)
– Recursive algorithm that halves the input but must
examine every item in the input
• T(n) = 2T(n/2) + 1 → Θ(n)
– Recursive algorithm that splits the input into 2 halves
and does a constant amount of other work

3
Recurrent Algorithms
BINARY-SEARCH
• for an ordered array A, finds if x is in the array A[lo…hi]

[Figure: sorted array A = 2 3 5 7 9 10 11 12 with indices 1–8 and lo, mid, hi markers]

Alg.: BINARY-SEARCH (A, lo, hi, x)
  if (lo > hi)
    return FALSE
  mid ← ⌊(lo + hi)/2⌋
  if x = A[mid]
    return TRUE
  if (x < A[mid])
    return BINARY-SEARCH (A, lo, mid-1, x)
  if (x > A[mid])
    return BINARY-SEARCH (A, mid+1, hi, x)
4
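A runnable version of the pseudocode might look like this in Python; `binary_search` is an illustrative name, and note that Python lists are 0-indexed rather than 1-indexed as in the figure:

```python
def binary_search(A, lo, hi, x):
    """Recursive binary search mirroring the BINARY-SEARCH pseudocode."""
    if lo > hi:
        return False
    mid = (lo + hi) // 2          # floor of (lo + hi) / 2
    if x == A[mid]:
        return True
    if x < A[mid]:
        return binary_search(A, lo, mid - 1, x)
    return binary_search(A, mid + 1, hi, x)

A = [2, 3, 5, 7, 9, 10, 11, 12]               # the sorted array from the slide
print(binary_search(A, 0, len(A) - 1, 9))     # True
print(binary_search(A, 0, len(A) - 1, 4))     # False
```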
Analysis of BINARY-SEARCH
Alg.: BINARY-SEARCH (A, lo, hi, x)
  if (lo > hi)                         ← constant time: c1
    return FALSE
  mid ← ⌊(lo + hi)/2⌋                  ← constant time: c2
  if x = A[mid]                        ← constant time: c3
    return TRUE
  if (x < A[mid])
    BINARY-SEARCH (A, lo, mid-1, x)    ← same problem of size n/2
  if (x > A[mid])
    BINARY-SEARCH (A, mid+1, hi, x)    ← same problem of size n/2

• T(n) = c + T(n/2)
– T(n): running time for an array of size n
5
Methods for Solving Recurrences

• Substitution method (iteration)

• Recursion tree method

• Master method

6
The Substitution Method
• Convert the recurrence into a summation and try
to bound it using known series
– Use back-substitution to express the recurrence in
terms of n and the initial (boundary) condition.
– Iterate the recurrence until the initial condition is
reached.

7
Substitution Method - Examples
T(n) = 1 + T(n/2)
T(n) = 1 + T(n/2)          [T(n/2) = 1 + T(n/4)]
     = 1 + 1 + T(n/4)      [T(n/4) = 1 + T(n/8)]
     = 1 + 1 + 1 + T(n/8)
After substituting k times, we get
T(n) = 1 + 1 + … + 1 + T(n/2^k)    (k ones)
     = k + T(n/2^k)
The iteration stops when T(n/2^k) = T(1)

8
Substitution Method - Examples
=> n/2^k = 1
=> 2^k = n
=> k = log_2(n)
Time complexity = Θ(log n)

9
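The unrolling above can be checked numerically; a small sketch, assuming T(1) = 0 and n a power of two (the helper `T` is illustrative):

```python
import math

def T(n):
    """T(n) = 1 + T(n // 2) with T(1) = 0, unrolled iteratively."""
    steps = 0
    while n > 1:
        n //= 2        # each halving contributes one unit of work
        steps += 1
    return steps

for n in [8, 64, 1024]:
    # steps equal log2(n) exactly for powers of two
    print(n, T(n), int(math.log2(n)))
```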
Substitution Method – Examples
T(n) = n + 2T(n/2)
T(n) = n + 2T(n/2)          [T(n/2) = n/2 + 2T(n/4)]
     = n + 2(n/2 + 2T(n/4))
     = n + n + 4T(n/4)
     = n + n + 4(n/4 + 2T(n/8))
     = n + n + n + 8T(n/8)

After k substitutions, we get
T(n) = kn + 2^k T(n/2^k)    …(i)

10
Substitution Method - Examples
Assume T(n/2^k) = T(1)
n/2^k = 1
2^k = n    …(ii)
k = log_2(n)    …(iii)
Put (ii) & (iii) in (i), we get
T(n) = n·T(1) + n log n
T(n) = n × 1 + n log n
T(n) = n + n log n

So, time complexity = Θ(n log n)


11
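As a sanity check on the closed form n log n + n, one might evaluate the recurrence directly; a sketch assuming T(1) = 1 and n a power of two (the helper name is illustrative):

```python
import math

def T(n):
    """T(n) = n + 2*T(n/2) with T(1) = 1, for n a power of two."""
    if n == 1:
        return 1
    return n + 2 * T(n // 2)

for n in [2, 8, 64]:
    closed = n * int(math.log2(n)) + n   # n log n + n from the derivation
    print(n, T(n), closed)               # the two columns agree
```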
The recursion-tree method

Convert the recurrence into a tree:


– Each node represents the cost of a single subproblem
in the recursion
– Sum up the costs of all nodes, level by level

Used to “guess” a solution for the recurrence

12
Example 1
W(n) = 2W(n/2) + n^2

• Subproblem size at level i is: n/2^i
• Subproblem size hits 1 when 1 = n/2^i ⇒ i = lg n
• Cost of a subproblem at level i = (n/2^i)^2; number of nodes at level i = 2^i
• Total cost:

W(n) = sum_{i=0}^{lg n - 1} 2^i (n/2^i)^2 + 2^(lg n) W(1)
     = n^2 · sum_{i=0}^{lg n - 1} (1/2)^i + O(n)
     ≤ n^2 · sum_{i=0}^{∞} (1/2)^i + O(n)
     = n^2 · 1/(1 - 1/2) + O(n) = 2n^2 + O(n)

⇒ W(n) = O(n^2)
13
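The O(n^2) bound can be spot-checked by evaluating the recurrence directly; a sketch assuming W(1) = 1 and n a power of two (`W` is an illustrative helper):

```python
def W(n):
    """W(n) = 2*W(n/2) + n**2 with W(1) = 1, for n a power of two."""
    if n == 1:
        return 1
    return 2 * W(n // 2) + n * n

for n in [4, 32, 256]:
    # the recursion-tree bound gives W(n) <= 2*n^2 (plus lower-order terms)
    print(n, W(n), 2 * n * n, W(n) <= 2 * n * n)
```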
Example 2
E.g.: T(n) = 3T(n/4) + cn^2

• Subproblem size at level i is: n/4^i
• Subproblem size hits 1 when 1 = n/4^i ⇒ i = log_4(n)
• Cost of a node at level i = c(n/4^i)^2
• Number of nodes at level i = 3^i ⇒ last level has 3^(log_4 n) = n^(log_4 3) nodes
• Total cost:

T(n) = sum_{i=0}^{log_4 n - 1} 3^i c(n/4^i)^2 + Θ(n^(log_4 3))
     = cn^2 · sum_{i=0}^{log_4 n - 1} (3/16)^i + Θ(n^(log_4 3))
     ≤ cn^2 · sum_{i=0}^{∞} (3/16)^i + Θ(n^(log_4 3))
     = cn^2 · 1/(1 - 3/16) + Θ(n^(log_4 3)) = (16/13)cn^2 + o(n^2)

⇒ T(n) = O(n^2)
14
Example 2 – Solving by substitution
method
T(n) = 3T(n/4) + cn^2 …(i)

First find T(n/4) from the above relation:

=> T(n/4) = 3T(n/4^2) + c(n/4)^2
=> 3T(n/4) = 3^2 T(n/4^2) + 3c(n/4)^2 …(ii)
put (ii) in (i), we get
=> T(n) = 3^2 T(n/4^2) + 3c(n/4)^2 + cn^2 …(iii)

Now, find T(n/4^2) from eq. (i):

=> T(n/4^2) = 3T(n/4^3) + c(n/4^2)^2
=> 3^2 T(n/4^2) = 3^3 T(n/4^3) + 3^2 c(n/4^2)^2 …(iv)
put (iv) in (iii), we get
T(n) = 3^3 T(n/4^3) + 3^2 c(n/4^2)^2 + 3c(n/4)^2 + cn^2

After k substitutions, we get (where k is large)

T(n) = 3^k T(n/4^k) + cn^2 + 3c(n/4)^2 + 3^2 c(n/4^2)^2 + … + 3^(k-1) c(n/4^(k-1))^2

15
Solving by substitution method
T(n) = 3^k T(n/4^k) + cn^2 + 3c(n/4)^2 + 3^2 c(n/4^2)^2 + … + 3^(k-1) c(n/4^(k-1))^2
     = cn^2 (1 + 3/16 + (3/16)^2 + (3/16)^3 + ……) + lower-order terms

Clearly, the above sum is a GP (geometric) series.

Also, the common ratio is 3/4^2 = 3/16, which is less than 1.
Formula: sum of an infinite GP series = a/(1 - r)
where a = first term
and r = common ratio

T(n) ≤ cn^2 · 1/(1 - 3/16) = (16/13) cn^2
T(n) = d·n^2, where d is some constant
Θ(n^2)

16
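The GP bound (16/13)·cn^2 can be compared against the recurrence numerically; a sketch assuming c = 1, T(1) = c, and n a power of four (the helper and base case are illustrative assumptions):

```python
def T(n, c=1):
    """T(n) = 3*T(n/4) + c*n**2, assuming T(1) = c, for n a power of four."""
    if n == 1:
        return c
    return 3 * T(n // 4, c) + c * n * n

for n in [4, 16, 256]:
    bound = (16 / 13) * n * n        # cn^2 / (1 - 3/16) with c = 1
    print(n, T(n), round(bound, 1))  # T(n) stays below the GP bound
```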
Example 3 (simpler proof)
W(n) = W(n/3) + W(2n/3) + n

• The longest path from the root to a leaf is:
  n → (2/3)n → (2/3)^2 n → … → 1
• Subproblem size hits 1 when 1 = (2/3)^i n ⇒ i = log_{3/2}(n)
• Cost of the problem at each level = n
• Total cost:

W(n) ≤ n + n + … + n = n · log_{3/2}(n) = n · (lg n)/(lg 3/2) = O(n lg n)
17
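The O(n lg n) claim can be spot-checked numerically; a sketch assuming the recursion bottoms out at W(n) = 0 for n < 1 (an assumption, since the slides leave the base case implicit):

```python
import math

def W(n):
    """W(n) = W(n/3) + W(2n/3) + n, assuming W(n) = 0 for n < 1."""
    if n < 1:
        return 0
    return W(n / 3) + W(2 * n / 3) + n

for n in [64, 1024]:
    # each tree level costs at most n, and there are at most
    # log_{3/2}(n) + 1 non-empty levels
    levels = math.log(n, 1.5) + 1
    print(n, round(W(n)), round(n * levels))
```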
Example 3 - Substitution
W(n) = W(n/3) + W(2n/3) + n …(i)
=> W(n/3) = W(n/3^2) + W(2n/3^2) + n/3 …(ii)
=> W(2n/3) = W(2n/3^2) + W(n(2/3)^2) + 2n/3 …(iii)
Put (ii) & (iii) in (i), we get

=> W(n) = W(n/3^2) + 2W(2n/3^2) + W(n(2/3)^2) + n + n

After k substitutions, bounding by the longest chain W(n(2/3)^k)
(each level contributes at most n), we get
=> W(n) ≈ W(n(2/3)^k) + k·n …(iv)
Now we will find the value of k

=> W(n(2/3)^k) = W(1)

=> n(2/3)^k = 1
=> n = (3/2)^k
18
Solving by substitution method
=> log n = k log(3/2)

=> k = log n / log(3/2)
Now from equation (iv), we have
W(n) = W(n(2/3)^k) + k·n …(iv)

Time complexity = O(k·n)

But k = Θ(log n)
So, time complexity = O(n log n)

19
Master’s method
• For solving recurrences of the form:
  T(n) = a·T(n/b) + f(n)
  where a ≥ 1, b > 1, and f(n) = Θ(n^k log^p n)

Case 1: if log_b(a) > k, then Θ(n^(log_b a))

Case 2: if log_b(a) = k
  if p > -1, Θ(n^k log^(p+1) n)
  if p = -1, Θ(n^k log log n)
  if p < -1, Θ(n^k)

Case 3: if log_b(a) < k
  if p ≥ 0, Θ(n^k log^p n)
  if p < 0, Θ(n^k)
20
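The three cases might be encoded as a small helper; `master` is a hypothetical name, and comparing log_b(a) to k via `math.isclose` is a simplification for floating-point inputs:

```python
import math

def master(a, b, k, p):
    """Extended Master theorem for T(n) = a*T(n/b) + Θ(n^k log^p n).

    Returns the Θ-class as a string, following the three cases above.
    """
    c = math.log(a, b)                      # log_b(a)
    if c > k and not math.isclose(c, k):
        return f"Θ(n^{c:.3f})"              # Case 1
    if math.isclose(c, k):                  # Case 2
        if p > -1:
            return f"Θ(n^{k} log^{p + 1} n)"
        if p == -1:
            return f"Θ(n^{k} log log n)"
        return f"Θ(n^{k})"                  # p < -1
    if p >= 0:                              # Case 3
        return f"Θ(n^{k} log^{p} n)"
    return f"Θ(n^{k})"

print(master(2, 2, 1, 0))   # Θ(n^1 log^1 n)
print(master(2, 2, 2, 0))   # Θ(n^2 log^0 n)
```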
Examples
T(n) = 2T(n/2) + n

Sol: Compare it with T(n) = a·T(n/b) + f(n),
where f(n) = Θ(n^k log^p n)
a = 2, b = 2, k = 1, p = 0; log_2(2) = 1 = k
log_b(a) = k ……… (Case 2)

Case 2: if log_b(a) = k
  if p > -1, Θ(n^k log^(p+1) n)
  if p = -1, Θ(n^k log log n)
  if p < -1, Θ(n^k)

Clearly, here p = 0, so p > -1
⇒ Θ(n^k log^(p+1) n)

Time Complexity = Θ(n log n)
21
Examples
T(n) = 2T(n/2) + n^2

Sol: Compare it with T(n) = a·T(n/b) + f(n),
where f(n) = Θ(n^k log^p n)

a = 2, b = 2, k = 2, p = 0
=> log_2(2) = 1 < k (Case 3)

Case 3: if log_b(a) < k
  if p ≥ 0, Θ(n^k log^p n)
  if p < 0, Θ(n^k)

Here p = 0 ≥ 0

Θ(n^2)

22
Examples

T(n) = 2T(n/2) + √n

Sol: Compare it with T(n) = a·T(n/b) + f(n),
where f(n) = Θ(n^k log^p n)
=> a = 2, b = 2, k = 1/2, p = 0
=> log_2(2) = 1 > k (Case 1)

Case 1: if log_b(a) > k, then Θ(n^(log_b a))

Time Complexity = Θ(n)

23
Examples

T(n) = 3T(n/4) + n log n

Sol: Compare it with T(n) = a·T(n/b) + f(n),
where f(n) = Θ(n^k log^p n)
=> a = 3, b = 4, k = 1, p = 1
=> log_4(3) ≈ 0.793 < k (Case 3)

Case 3: if log_b(a) < k
  if p ≥ 0, Θ(n^k log^p n)
  if p < 0, Θ(n^k)

In the above example, p = 1 ≥ 0

Time Complexity = Θ(n log n)
24


Examples
T(n) = 2T(n/2) + n/log n
Sol: Compare it with T(n) = a·T(n/b) + f(n),
where f(n) = Θ(n^k log^p n)

=> a = 2, b = 2, k = 1, p = -1

=> log_2(2) = 1 = k (Case 2)

Case 2: if log_b(a) = k
  if p > -1, Θ(n^k log^(p+1) n)
  if p = -1, Θ(n^k log log n)
  if p < -1, Θ(n^k)
We have p = -1
⇒ Θ(n^k log log n)

Time complexity = Θ(n log log n)


25
