UNIT 3
Divide and Conquer
Divide-and-Conquer
The most well-known algorithm design strategy:
1. Divide instance of problem into two or more smaller
instances
2. Solve smaller instances recursively
3. Obtain solution to original (larger) instance by
combining these solutions
Note: the divide-and-conquer technique is ideally suited for parallel
computations, in which each subproblem can be solved simultaneously by
its own processor (see the sketch after this list).
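As an illustration of the three steps, here is a minimal C sketch (not from the original material; the array and function names are chosen for illustration) that finds the largest element of an array by dividing the range in half, solving each half recursively, and combining the two answers:

#include <stdio.h>

/* Divide-and-conquer maximum: 1. divide the range in half,
   2. solve each half recursively, 3. combine the two partial answers. */
int max_element(const int a[], int lo, int hi) {
    if (lo == hi)                            /* base case: a single element */
        return a[lo];
    int mid = (lo + hi) / 2;                 /* divide */
    int left  = max_element(a, lo, mid);     /* solve the two halves recursively */
    int right = max_element(a, mid + 1, hi);
    return left > right ? left : right;      /* combine */
}

int main(void) {
    int a[] = {8, 3, 2, 9, 7, 1, 5, 4};
    printf("%d\n", max_element(a, 0, 7));    /* prints 9 */
    return 0;
}

The two recursive calls are independent of each other, which is exactly why the paradigm parallelizes so naturally.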
Divide-and-Conquer Technique (cont.)
[Figure: a problem (instance) of size n is divided into two subproblems of size n/2; a solution is obtained for each subproblem, and the two solutions are combined into a solution to the original problem.]
It generally leads to a recursive algorithm!
Algorithms
Merge Sort
Quick Sort
Binary Tree Traversals and related properties
Multiplication of two large integers
Strassen’s Matrix Multiplication
General Divide-and-Conquer Recurrence
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0

Master Theorem:
If a < b^d, T(n) ∈ Θ(n^d)
If a = b^d, T(n) ∈ Θ(n^d log n)
If a > b^d, T(n) ∈ Θ(n^(log_b a))

Note: The same results hold with O instead of Θ.

Examples:
T(n) = 4T(n/2) + n    →  T(n) ∈ Θ(n^2)
T(n) = 4T(n/2) + n^2  →  T(n) ∈ Θ(n^2 log n)
T(n) = 4T(n/2) + n^3  →  T(n) ∈ Θ(n^3)
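As a quick numerical check of the first example (this program is an illustration added here, not part of the original slides), the following C sketch evaluates T(n) = 4T(n/2) + n exactly for powers of 2 and prints T(n)/n^2; the ratio settles near a constant (2), consistent with T(n) ∈ Θ(n^2):

#include <stdio.h>

/* Exact value of T(n) = 4T(n/2) + n with T(1) = 1, for n a power of 2. */
long long T(long long n) {
    if (n == 1)
        return 1;
    return 4 * T(n / 2) + n;     /* one recursive call per level, depth log2 n */
}

int main(void) {
    for (long long n = 1; n <= (1 << 20); n *= 2)
        printf("n = %8lld   T(n)/n^2 = %.4f\n",
               n, (double)T(n) / ((double)n * (double)n));
    return 0;                    /* the printed ratio approaches 2 */
}

Analogous checks work for the other two recurrences.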
Mergesort Example
8 3 2 9 7 1 5 4
8 3 2 9           7 1 5 4
8 3     2 9       7 1     5 4
8   3   2   9     7   1   5   4
3 8     2 9       1 7     4 5
2 3 8 9           1 4 5 7
1 2 3 4 5 7 8 9

Note: the non-recursive (bottom-up) version of Mergesort starts from merging single elements into sorted pairs.
Analysis of Mergesort
All cases have the same efficiency: Θ(n log n)
T(n) = 2T(n/2) + Θ(n), T(1) = 0
The number of comparisons in the worst case is close to the
theoretical minimum for comparison-based sorting:
⌈log2 n!⌉ ≈ n log2 n − 1.44n
Space requirement: Θ(n) (not in-place)
Can be implemented without recursion (bottom-up)
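A compact recursive C sketch along these lines (illustrative, not the textbook's pseudocode; the Θ(n) auxiliary buffer is passed in explicitly):

#include <stdio.h>
#include <string.h>

/* Merge the sorted halves a[lo..mid] and a[mid+1..hi] using the buffer tmp. */
static void merge(int a[], int tmp[], int lo, int mid, int hi) {
    int i = lo, j = mid + 1, k = lo;
    while (i <= mid && j <= hi)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i <= mid) tmp[k++] = a[i++];
    while (j <= hi)  tmp[k++] = a[j++];
    memcpy(a + lo, tmp + lo, (size_t)(hi - lo + 1) * sizeof(int));
}

/* Top-down mergesort: split in half, sort each half recursively, then merge. */
void merge_sort(int a[], int tmp[], int lo, int hi) {
    if (lo >= hi)
        return;                              /* 0 or 1 element: already sorted */
    int mid = (lo + hi) / 2;
    merge_sort(a, tmp, lo, mid);
    merge_sort(a, tmp, mid + 1, hi);
    merge(a, tmp, lo, mid, hi);
}

int main(void) {
    int a[] = {8, 3, 2, 9, 7, 1, 5, 4}, tmp[8];
    merge_sort(a, tmp, 0, 7);
    for (int i = 0; i < 8; i++)
        printf("%d ", a[i]);                 /* prints 1 2 3 4 5 7 8 9 */
    printf("\n");
    return 0;
}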
Quicksort
Select a pivot (partitioning element) – here, the first element
Rearrange the list so that all the elements in the first s
positions are smaller than or equal to the pivot and all the
elements in the remaining n-s positions are larger than or
equal to the pivot (see next slide for an algorithm)
first s positions: A[i] ≤ p          remaining n−s positions: A[i] ≥ p
Exchange the pivot with the last element in the first (i.e., ≤)
subarray — the pivot is now in its final position
Sort the two subarrays recursively
Partitioning Algorithm
Scan from the left until an element A[i] ≥ p is found (or i > r), and from the
right until an element A[j] ≤ p is found (or j = l); swap A[i] and A[j] and
repeat until the two scans cross, then exchange the pivot with A[j].
Time complexity: Θ(r − l) comparisons
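A C sketch of this partitioning scheme (Hoare-style, with the first element as the pivot; the exact code is an illustrative assumption, not the original slide's pseudocode):

/* Partition a[l..r] around the pivot p = a[l] and return the pivot's final
   position. The left scan stops at an element >= p (or when i > r), the
   right scan stops at an element <= p; out-of-order pairs are swapped
   until the scans cross. */
int hoare_partition(int a[], int l, int r) {
    int p = a[l];
    int i = l, j = r + 1;
    for (;;) {
        do { i++; } while (i <= r && a[i] < p);
        do { j--; } while (a[j] > p);          /* stops at a[l] = p at the latest */
        if (i >= j)
            break;                             /* the scans have crossed */
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }
    int t = a[l]; a[l] = a[j]; a[j] = t;       /* pivot goes to its final place */
    return j;
}

/* Quicksort: partition, then sort the two subarrays recursively. */
void quicksort(int a[], int l, int r) {
    if (l < r) {
        int s = hoare_partition(a, l, r);
        quicksort(a, l, s - 1);
        quicksort(a, s + 1, r);
    }
}

On the array 5 3 1 9 8 2 4 7 below, the first call to hoare_partition produces 2 3 1 4 5 8 9 7 and returns position 4.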
Quicksort Example
5 3 1 9 8 2 4 7     initial array; the pivot is 5
2 3 1 4 5 8 9 7     after partitioning: the pivot 5 is in its final position
1 2 3 4 5 7 8 9     after recursively sorting the subarrays 2 3 1 4 and 8 9 7
Analysis of Quicksort
Best case: split in the middle — Θ(n log n)
Worst case: sorted array! — Θ(n^2), since T(n) = T(n−1) + Θ(n)
Average case: random arrays — Θ(n log n)
Improvements:
• better pivot selection: median of three partitioning
• switch to insertion sort on small subfiles
• elimination of recursion
Together these give a 20-25% improvement
Considered the method of choice for internal sorting of large
files (n ≥ 10000)
Quick Sort: Example
Worst-case time complexity: T(n) = T(n−1) + Θ(n), which gives T(n) = O(n^2)
Best-case time complexity: T(n) = 2T(n/2) + n; using the Master Theorem, T(n) = Θ(n log2 n)
Binary Tree Traversals and Related
Properties
The most important divide-and-conquer algorithms for
binary trees are the three classic traversals: preorder,
inorder, and postorder.
All three traversals visit nodes of a binary tree recursively,
i.e., by visiting the tree’s root and its left and right subtrees.
preorder traversal: the root is visited before the left and
right subtrees are visited.
inorder traversal: the root is visited after visiting its left
subtree but before visiting the right subtree.
postorder traversal: the root is visited after visiting the left
and right subtrees.
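A minimal C sketch of the three traversals (the node type and the "visit" action, printing the value, are assumptions made for illustration):

#include <stdio.h>

struct Node {
    int value;
    struct Node *left;
    struct Node *right;
};

void preorder(const struct Node *T) {
    if (T == NULL) return;
    printf("%d ", T->value);       /* visit the root first */
    preorder(T->left);
    preorder(T->right);
}

void inorder(const struct Node *T) {
    if (T == NULL) return;
    inorder(T->left);
    printf("%d ", T->value);       /* visit the root between the two subtrees */
    inorder(T->right);
}

void postorder(const struct Node *T) {
    if (T == NULL) return;
    postorder(T->left);
    postorder(T->right);
    printf("%d ", T->value);       /* visit the root last */
}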
Height of a Tree
ALGORITHM Height(T)
//Computes recursively the height of a binary tree
//Input: A binary tree T
//Output: The height of T
if T = ∅
    return −1
else
    return max{Height(Tleft), Height(Tright)} + 1
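The same algorithm in C, using the node type from the traversal sketch above (again an illustrative assumption):

/* Height of a binary tree: -1 for the empty tree, otherwise one more
   than the larger of the two subtree heights. */
int Height(const struct Node *T) {
    if (T == NULL)
        return -1;
    int hl = Height(T->left);
    int hr = Height(T->right);
    return (hl > hr ? hl : hr) + 1;
}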
Leaf Count
//Computes recursively the number of leaves in a binary tree
//Input: A binary tree T (using the node type from the sketch above)
//Output: The leaf count of T
int count = 0;                         /* global counter; must be 0 before the first call */
int LeafCount(struct Node *T)
{
    if (T != NULL) {
        LeafCount(T->left);
        if (T->left == NULL && T->right == NULL)
            count++;                   /* a node with no children is a leaf */
        LeafCount(T->right);
    }
    return count;
}
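A small usage example (illustrative values) that builds a five-node tree by hand and exercises the routines above:

int main(void) {
    /* Tree:        1
                   / \
                  2   3
                 / \
                4   5                            */
    struct Node n4 = {4, NULL, NULL}, n5 = {5, NULL, NULL};
    struct Node n3 = {3, NULL, NULL};
    struct Node n2 = {2, &n4, &n5};
    struct Node n1 = {1, &n2, &n3};

    inorder(&n1);                                /* prints 4 2 5 1 3 */
    printf("\nheight = %d\n", Height(&n1));      /* prints height = 2 */
    printf("leaves = %d\n", LeafCount(&n1));     /* prints leaves = 3 */
    return 0;
}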
Time Complexity
The time complexity of tree traversal and of counting leaves is O(n),
because every node of the binary tree is visited.
The time complexity of finding the height of a binary search tree is:
Best case (balanced tree): O(n)
Recurrence: T(n) = 2T(n/2) + c, where the two T(n/2) terms account for the
recursive calls on the subtrees and c for all the rest.
Worst case (skewed tree): O(n)
Recurrence: T(n) = T(n−1) + c
Multiplication of Large Integers
Multiplying two n-digit integers directly: each of the n digits of the first
number is multiplied by each of the n digits of the second number, for
a total of n^2 digit multiplications.
The divide-and-conquer technique can be used to increase the efficiency.
To demonstrate the basic idea of the algorithm, let us start with a
case of two-digit integers, say, 23 and 14. These numbers can be
represented as follows:
23 = 2 · 10^1 + 3 · 10^0 and 14 = 1 · 10^1 + 4 · 10^0.
Now let us multiply them:
23 ∗ 14 = (2 · 10^1 + 3 · 10^0) ∗ (1 · 10^1 + 4 · 10^0)
        = (2 ∗ 1)10^2 + (2 ∗ 4 + 3 ∗ 1)10^1 + (3 ∗ 4)10^0 = 322
We can compute the middle term with just one digit multiplication
by taking advantage of the products 2 ∗ 1 and 3 ∗ 4 that need to be
computed anyway:
2 ∗ 4 + 3 ∗ 1 = (2 + 3) ∗ (1 + 4) − 2 ∗ 1 − 3 ∗ 4.
For any pair of two-digit numbers a = a1a0 and b = b1b0, their
product c can be computed by the formula
c = a ∗ b = c2 · 10^2 + c1 · 10^1 + c0,
where
c2 = a1 ∗ b1 is the product of their first halves,
c0 = a0 ∗ b0 is the product of their second halves,
c1 = (a1 + a0) ∗ (b1 + b0) − (c2 + c0) is the product of the sum
of the a’s halves and the sum of the b’s halves minus the sum of c2
and c0.
Example: c = a ∗ b with a = 12 and b = 31 (a1 = 1, a0 = 2, b1 = 3, b0 = 1):
c2 = a1 ∗ b1 = 1 ∗ 3 = 3
c0 = a0 ∗ b0 = 2 ∗ 1 = 2
c1 = (a1 + a0) ∗ (b1 + b0) − (c2 + c0) = (1 + 2) ∗ (3 + 1) − (3 + 2) = 12 − 5 = 7
c = c2 · 10^2 + c1 · 10^1 + c0 = 3 ∗ 100 + 7 ∗ 10 + 2 = 372
Time Complexity
If n/2 is even, we can apply the same method for computing the
products c2, c0, and c1. Thus, if n is a power of 2, we have a
recursive algorithm for computing the product of two n-digit
integers.
T(n) = 3T(n/2) for n > 1, T(1) = 1
Using the Master Theorem:
a = 3, b = 2, d = 0; since a > b^d, T(n) ∈ Θ(n^(log_b a))
T(n) = 3^(log2 n) = n^(log2 3) ≈ n^1.585, so T(n) ∈ Θ(n^1.585)
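A compact C sketch of this three-multiplication scheme for machine-word integers (an illustrative assumption: a real large-integer version would operate on arrays of digits rather than long long values):

#include <stdio.h>

static long long pow10ll(int k) {          /* 10^k as a long long */
    long long p = 1;
    while (k-- > 0) p *= 10;
    return p;
}

static int num_digits(long long x) {       /* number of decimal digits of x */
    int d = 1;
    while (x >= 10) { x /= 10; d++; }
    return d;
}

/* Multiply a and b using c2, c0 and c1 = (a1 + a0)(b1 + b0) - (c2 + c0). */
long long multiply_dc(long long a, long long b) {
    if (a < 10 || b < 10)                  /* base case: a single-digit factor */
        return a * b;
    int n = num_digits(a) > num_digits(b) ? num_digits(a) : num_digits(b);
    long long p = pow10ll(n / 2);
    long long a1 = a / p, a0 = a % p;      /* split a into halves */
    long long b1 = b / p, b0 = b % p;      /* split b into halves */
    long long c2 = multiply_dc(a1, b1);
    long long c0 = multiply_dc(a0, b0);
    long long c1 = multiply_dc(a1 + a0, b1 + b0) - (c2 + c0);
    return c2 * p * p + c1 * p + c0;
}

int main(void) {
    printf("%lld\n", multiply_dc(23, 14)); /* prints 322 */
    printf("%lld\n", multiply_dc(12, 31)); /* prints 372 */
    return 0;
}

Only three recursive multiplications are made per call, which is where the T(n) = 3T(n/2) recurrence comes from.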
Strassen’s Matrix Multiplication
Basic Matrix Multiplication:
Suppose we want to multiply two matrices of size N
x N: for example A x B = C.
C11 = a11·b11 + a12·b21
C12 = a11·b12 + a12·b22
C21 = a21·b11 + a22·b21
C22 = a21·b12 + a22·b22
2×2 matrix multiplication can be accomplished with 8 multiplications (2^(log2 8) = 2^3).
Basic Matrix Multiplication
/* N is assumed to be a compile-time constant giving the matrix dimension. */
void matrix_mult(double A[N][N], double B[N][N], double C[N][N]) {
    for (int i = 0; i < N; i++) {
        for (int j = 0; j < N; j++) {
            C[i][j] = 0.0;                        /* compute C[i][j] */
            for (int k = 0; k < N; k++)
                C[i][j] += A[i][k] * B[k][j];
        }
    }
}
Ci,j = Σ (k = 1..N) ai,k · bk,j

Time analysis:
T(N) = Σ (i = 1..N) Σ (j = 1..N) Σ (k = 1..N) c = cN^3, thus T(N) = O(N^3)
Strassen's Matrix Multiplication
Strassen showed that 2×2 matrix multiplication can be accomplished with
7 multiplications and 18 additions or subtractions (2^(log2 7) = 2^2.807).
This reduction is achieved with a divide-and-conquer approach.
Divide-and-Conquer
Divide-and-conquer is a general algorithm design
paradigm:
• Divide: divide the input data S into two or more disjoint
subsets S1, S2, …
• Recur: solve the subproblems recursively
• Conquer: combine the solutions for S1, S2, …, into a
solution for S
The base cases for the recursion are subproblems of
constant size
Analysis can be done using recurrence equations
Divide and Conquer Matrix Multiply
A = | A0  A1 |    B = | B0  B1 |    R = A·B = | A0B0 + A1B2   A0B1 + A1B3 |
    | A2  A3 |        | B2  B3 |              | A2B0 + A3B2   A2B1 + A3B3 |
•Divide each matrix into sub-matrices: A0, A1, A2, etc.
•Use the blocked matrix multiply equations
•Recursively multiply the sub-matrices
Divide and Conquer Matrix Multiply
At the 1×1 level, A · B = R is a single scalar multiplication: a0 · b0 = a0b0.
• Terminate recursion with a simple base case
Strassen's Matrix Multiplication
P1 = (A11 + A22) ∗ (B11 + B22)          C11 = P1 + P4 − P5 + P7
P2 = (A21 + A22) ∗ B11                  C12 = P3 + P5
P3 = A11 ∗ (B12 − B22)                  C21 = P2 + P4
P4 = A22 ∗ (B21 − B11)                  C22 = P1 + P3 − P2 + P6
P5 = (A11 + A12) ∗ B22
P6 = (A21 − A11) ∗ (B11 + B12)
P7 = (A12 − A22) ∗ (B21 + B22)
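A minimal C sketch of one 2×2 Strassen step, using scalar entries in place of the matrix blocks (an illustrative assumption; in the full algorithm the blocks are (n/2)×(n/2) submatrices and the seven products are computed recursively):

#include <stdio.h>

void strassen_2x2(const double A[2][2], const double B[2][2], double C[2][2]) {
    double P1 = (A[0][0] + A[1][1]) * (B[0][0] + B[1][1]);
    double P2 = (A[1][0] + A[1][1]) * B[0][0];
    double P3 = A[0][0] * (B[0][1] - B[1][1]);
    double P4 = A[1][1] * (B[1][0] - B[0][0]);
    double P5 = (A[0][0] + A[0][1]) * B[1][1];
    double P6 = (A[1][0] - A[0][0]) * (B[0][0] + B[0][1]);
    double P7 = (A[0][1] - A[1][1]) * (B[1][0] + B[1][1]);
    C[0][0] = P1 + P4 - P5 + P7;    /* = a11*b11 + a12*b21 */
    C[0][1] = P3 + P5;              /* = a11*b12 + a12*b22 */
    C[1][0] = P2 + P4;              /* = a21*b11 + a22*b21 */
    C[1][1] = P1 + P3 - P2 + P6;    /* = a21*b12 + a22*b22 */
}

int main(void) {
    double A[2][2] = {{1, 2}, {3, 4}};
    double B[2][2] = {{5, 6}, {7, 8}};
    double C[2][2];
    strassen_2x2(A, B, C);          /* expected product: 19 22 / 43 50 */
    printf("%g %g\n%g %g\n", C[0][0], C[0][1], C[1][0], C[1][1]);
    return 0;
}

Only 7 multiplications are performed (at the cost of 18 additions/subtractions), which is what drives the T(n) = 7T(n/2) recurrence below.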
Comparison
C11 = P1 + P4 − P5 + P7
    = (A11 + A22)(B11 + B22) + A22 ∗ (B21 − B11) − (A11 + A12) ∗ B22 + (A12 − A22) ∗ (B21 + B22)
    = A11B11 + A11B22 + A22B11 + A22B22 + A22B21 − A22B11 − A11B22 − A12B22 + A12B21 + A12B22 − A22B21 − A22B22
    = A11B11 + A12B21
Strassen’s Matrix Multiplication
Time Complexity
T(n) = 7T(n/2) for n > 1, T(1) = 1
Using the Master Theorem:
a = 7, b = 2, d = 0; since a > b^d, T(n) ∈ Θ(n^(log_b a))
T(n) = 7^(log2 n) = n^(log2 7) ≈ n^2.807, so T(n) ∈ Θ(n^2.807)