Unit 2 - Divide and Conquer & Greedy Strategy
Syllabus
Optimization problems
Introduction : Greedy Method
• Greedy algorithms don’t always yield optimal solutions but, when they
do, they’re usually the simplest and most efficient algorithms available.
Introduction : Greedy Method
• The greedy method suggests that one can devise an algorithm that works in stages, considering one input at a time.
• At each stage a decision is made regarding whether or not a particular input belongs in the optimal solution.
• For this purpose all the inputs must be arranged in a particular order by using a selection process.
• If the inclusion of the next input into the partially constructed solution would result in an infeasible solution, the input is not added to the partial solution; otherwise it is added.
• CHANGE-MAKING PROBLEM (coin changing)
Coin Changing
An optimal solution to the coin-changing problem is the minimum number of
coins whose total value equals a specified amount. For example, what is the
minimum number of coins (current U.S. mint) needed to total 83 cents?
Objective function: Minimize the number of coins returned.
Greedy solution: Always return the largest coin you can.
A Greedy Algorithm for Coin Changing
1. Set remval = initial_value.
2. Choose the largest coin whose value is at most remval.
3. Add the coin to the set of coins and set remval := remval - coin_value.
4. Repeat steps 2 and 3 until remval = 0.
Example trace for 83 cents:
83 - 25 = 58
58 - 25 = 33
33 - 25 = 8
8 - 5 = 3
3 - 1 = 2
2 - 1 = 1
1 - 1 = 0
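The steps above can be sketched in Python. This is a minimal sketch; the denomination tuple is an assumption (the current U.S. coins mentioned earlier), and the function name is my own.

```python
def greedy_change(amount, denominations=(25, 10, 5, 1)):
    """Greedily make change: repeatedly take the largest coin
    whose value does not exceed the remaining value (remval)."""
    coins = []
    remval = amount
    for coin in sorted(denominations, reverse=True):
        while coin <= remval:
            coins.append(coin)  # step 3: add coin to the set of coins
            remval -= coin      # remval := remval - coin_value
    return coins
```

For 83 cents this picks three quarters, one nickel, and three pennies, matching the trace above. Note that the greedy choice happens to be optimal for U.S. denominations but is not optimal for arbitrary coin systems.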
Greedy Method
• Feasible Solution:-
Any subset of the solutions that satisfies the constraints of the problem
is known as a feasible solution.
• Optimal Solution:-
The feasible solution that maximizes or minimizes the given objective
function is an optimal solution.
Example : MST
Control Abstraction : Greedy Method
Algorithm Greedy (a,n)
//a[1:n] contains the ‘n’ inputs
{
Solution: = 0; //initialize the solution
for i: = 1 to n do
{
x: = select (a);
if Feasible (solution, x) then
solution: = Union (solution, x);
}
return solution;
}
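A minimal Python rendering of this control abstraction, where the caller supplies the selection order and the feasibility test (the function and parameter names are mine):

```python
def greedy(inputs, select_key, feasible):
    """Generic greedy schema: consider the inputs one at a time in
    the order induced by select_key, and keep an input only if the
    partially constructed solution remains feasible."""
    solution = []
    for x in sorted(inputs, key=select_key):   # plays the role of Select(a)
        if feasible(solution, x):              # Feasible(solution, x)
            solution.append(x)                 # Union(solution, x)
    return solution
```

For instance, picking numbers in increasing order while keeping the running sum at most 10 keeps only those inputs whose inclusion stays feasible.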
Knapsack Problem
• Given :
• A knapsack with capacity ‘m’
• A set of ‘n’ objects with each item i having
• pi - a positive benefit
• wi - a positive weight
• Goal: Choose items with maximum total benefit Σ pixi, subject to the weight constraint Σ wixi ≤ m and 0 ≤ xi ≤ 1, 1 ≤ i ≤ n.
Knapsack Problem : Example
Ex: Consider the following instance of the knapsack problem with 3 objects whose
profits and weights are defined as
(P1, P2, P3) = (25, 24, 15), (W1, W2, W3) = (18, 15, 10)
n = 3, knapsack capacity m = 20.
Determine the optimum strategy for placing the objects into the knapsack.
The four feasible solutions are:
(1) Greedy about profit:
Order by profit: (p1, p2, p3) = (25, 24, 15), weights (18, 15, 10)
Solution (x1, x2, x3) = (1, 2/15, 0)
Σ xiwi = 18 × 1 + 15 × (2/15) = 20 ≤ 20
Σ xipi = 25 × 1 + 24 × (2/15) = 28.2
(2) Greedy about weight:
Order by weight: (w3, w2, w1) = (10, 15, 18), profits (15, 24, 25)
Solution (x1, x2, x3) = (0, 2/3, 1)
Σ xiwi = 10 × 1 + 15 × (2/3) = 20 ≤ 20
Σ xipi = 15 × 1 + 24 × (2/3) = 31
(3) Greedy about profit / unit weight:
Ratios: p1/w1 = 25/18 ≈ 1.39, p2/w2 = 24/15 = 1.6, p3/w3 = 15/10 = 1.5
Solution (x1, x2, x3) = (0, 1, 1/2)
Σ xiwi = 15 + 10 × (1/2) = 20 ≤ 20; Σ xipi = 24 + 15 × (1/2) = 31.5 (optimal)
(4) If an additional constraint of including each and every object is placed, then the greedy strategy could be:
Algorithm : Greedy Knapsack
Algorithm GreedyKnapsack(m, n)
// p[1:n] and w[1:n] contain the profits and weights resp. of the n objects,
// ordered such that p[i]/w[i] ≥ p[i+1]/w[i+1]. m is the knapsack size and
// x[1:n] is the solution vector.
{
    for i := 1 to n do x[i] := 0.0;   // initialize x — O(n)
    u := m;
    for i := 1 to n do                // O(n)
    {
        if (w[i] > u) then break;
        x[i] := 1.0;
        u := u - w[i];
    }
    if (i ≤ n) then x[i] := u/w[i];
}
• Analysis: If we do not consider the time needed for sorting the inputs, then the complexity is O(n).
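A runnable Python version of GreedyKnapsack, with the profit/weight sorting folded in (the function name and return convention are my own):

```python
def greedy_knapsack(profits, weights, m):
    """Fractional knapsack: take items whole in non-increasing
    profit/weight order; when the next item no longer fits, take
    the fraction of it that fills the remaining capacity u."""
    n = len(profits)
    order = sorted(range(n), key=lambda i: profits[i] / weights[i],
                   reverse=True)
    x = [0.0] * n       # solution vector, as in the pseudocode
    u = m               # remaining capacity
    for i in order:
        if weights[i] > u:
            x[i] = u / weights[i]   # fractional take, then stop
            break
        x[i] = 1.0
        u -= weights[i]
    total = sum(x[i] * profits[i] for i in range(n))
    return x, total
```

On the three-object instance above (p = (25, 24, 15), w = (18, 15, 10), m = 20) this yields x = (0, 1, 1/2) with profit 31.5, the optimal fractional solution.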
Knapsack : More examples
3) Let us consider that the capacity of the knapsack is W = 60 and the list of provided items is shown in the following table:

Item     A    B    C    D
Profit   280  100  120  120
Weight   40   10   20   24
Huffman Coding
• Huffman codes can be used to compress information
• JPEGs do use Huffman as part of their compression process
Huffman Coding
Character frequencies for the string to be encoded:
e,3 d,2 u,2 l,2 sp,2 k,1 b,1 v,1 i,1 s,1
Huffman Coding
• We then pick the nodes with the smallest frequency and combine them to form a new node.
• The selection of these nodes is the greedy part.
• The two selected nodes are removed from the set, but replaced by the combined node.
Huffman Coding
[Tree-construction figures: the two lowest-frequency nodes are merged repeatedly — (i,1)+(s,1) and (b,1)+(v,1) first, producing internal nodes of frequency 2, then nodes of frequency 3, 4, 4 and 5, then 7 and 9, and finally the root of frequency 16.]
Huffman Coding
• A traversal of the tree from root to leaf gives the Huffman code for that particular leaf character.
Huffman Coding
The resulting codes (variable-length encoding):
e  → 00
d  → 010
u  → 011
l  → 100
sp → 101
i  → 1100
s  → 1101
k  → 1110
b  → 11110
v  → 11111
Huffman Encoding (Cormen)
HUFFMAN(C)
1. n = |C|
2. Q = C                                // initialize the min-priority queue Q with the characters in C — O(n)
3. for i = 1 to n − 1                   // extracts the two lowest-frequency nodes each pass — O(n log n) overall
4.     allocate a new node z            // z represents the merger of x and y
5.     z.left = x = EXTRACT-MIN(Q)      // O(log n)
6.     z.right = y = EXTRACT-MIN(Q)
7.     z.freq = x.freq + y.freq
8.     INSERT(Q, z)                     // replace x and y in the queue with z
// After n − 1 mergers, line 9 returns the one node left in the queue, which is the root of the code tree.
9. return EXTRACT-MIN(Q)                // return the root of the tree
Analysis: building the queue takes O(n), and each of the n − 1 merge steps costs O(log n). Therefore Huffman runs in O(n) + O(n log n) = O(n log n).
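A compact Python version of HUFFMAN(C), using heapq as the min-priority queue (the tie-break counter and the tuple-based tree representation are implementation choices of mine):

```python
import heapq
from itertools import count

def huffman_codes(freqs):
    """Build the Huffman tree by repeatedly merging the two
    lowest-frequency nodes, then read codes off the tree:
    left edge = '0', right edge = '1'."""
    tie = count()  # keeps heap entries comparable when frequencies tie
    heap = [(f, next(tie), sym) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:                    # n - 1 mergers
        fx, _, x = heapq.heappop(heap)      # EXTRACT-MIN
        fy, _, y = heapq.heappop(heap)      # EXTRACT-MIN
        heapq.heappush(heap, (fx + fy, next(tie), (x, y)))  # INSERT z
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):         # internal node: recurse
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                               # leaf: a symbol
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes
```

For the ten-character frequency table above (total 16 characters), any Huffman tree encodes the string in 52 bits; the exact code words depend on how ties are broken.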
Huffman Coding
• These codes are then used to encode the string
Examples : Huffman Coding
• Solve the following using Huffman's code generation algorithm
JOB SEQUENCING WITH DEADLINES
The problem is stated as below.
• There are n jobs to be processed on a machine.
• Each job i has a deadline di ≥ 0 and profit pi ≥ 0.
• The profit pi is earned iff the job is completed by its deadline.
• A job is completed if it is processed on the machine for unit time.
• Only one machine is available for processing jobs.
• Only one job is processed at a time on the machine.
JOB SEQUENCING WITH DEADLINES (contd..)
• A feasible solution is a subset of jobs J such that each job in J is completed no later than its deadline.
• An optimal solution is a feasible solution with maximum profit value.
Example: Let n = 4, (p1, p2, p3, p4) = (100, 10, 15, 27), (d1, d2, d3, d4) = (2, 1, 2, 1)
JOB SEQUENCING WITH DEADLINES (contd..)
Example: Let n = 4, (p1, p2, p3, p4) = (100, 10, 15, 27), (d1, d2, d3, d4) = (2, 1, 2, 1)

Sr. No.   Feasible Solution   Processing Sequence   Profit Value
(i)       (1, 2)              (2, 1)                110
(ii)      (1, 3)              (1, 3) or (3, 1)      115
(iii)     (1, 4)              (4, 1)                127 (optimal)
(iv)      (2, 3)              (2, 3)                25
(v)       (3, 4)              (4, 3)                42
(vi)      (1)                 (1)                   100
(vii)     (2)                 (2)                   10
(viii)    (3)                 (3)                   15
(ix)      (4)                 (4)                   27
Example Explanation
• Consider the jobs in non-increasing order of profits, subject to the constraint that the resulting job sequence J is a feasible solution.
• In the example considered before, the non-increasing profit vector is (p1, p4, p3, p2) = (100, 27, 15, 10), with corresponding deadlines (d1, d4, d3, d2) = (2, 1, 2, 1).
GREEDY ALGORITHM TO OBTAIN AN OPTIMAL SOLUTION (Contd..)
(p1, p4, p3, p2) = (100, 27, 15, 10), (d1, d4, d3, d2) = (2, 1, 2, 1)
J = {1} is feasible.
J = {1, 4} is feasible with processing sequence (4, 1).
J = {1, 3, 4} is not feasible.
J = {1, 2, 4} is not feasible.
J = {1, 4} is optimal.
Examples :Job Sequencing Problems:
1) Write a greedy algorithm for sequencing unit time jobs with deadlines
and profits. Using this algorithm, find the optimal solutions when
n=5,(p1,p2,p3,p4,p5)=(20,15,10,5,1) and (d1,d2,d3,d4,d5)=(2,2,1,3,3).
GREEDY ALGORITHM TO OBTAIN AN OPTIMAL SOLUTION (Contd..)
    J[r+1] := i; k := k + 1;
    // i is inserted at position r+1 //
    // and the total number of jobs in J is increased by one //
  repeat
end JS
Algorithm for Job
sequencing with Deadlines
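A minimal Python sketch of the greedy strategy walked through above — consider jobs in non-increasing profit order and place each in the latest free unit-time slot at or before its deadline (the function name and slot layout are my own):

```python
def job_sequencing(profits, deadlines):
    """Greedy job sequencing with deadlines. Jobs are 0-indexed;
    slot[t] holds the job executed in unit-time slot t (1-based)."""
    n = len(profits)
    order = sorted(range(n), key=lambda i: profits[i], reverse=True)
    max_d = max(deadlines)
    slot = [None] * (max_d + 1)     # slot[0] is unused
    for i in order:
        # find the latest free slot no later than job i's deadline
        for t in range(min(deadlines[i], max_d), 0, -1):
            if slot[t] is None:
                slot[t] = i
                break
    scheduled = [j for j in slot[1:] if j is not None]
    return scheduled, sum(profits[j] for j in scheduled)
```

On the four-job example (p = (100, 10, 15, 27), d = (2, 1, 2, 1)) this schedules jobs 4 and 1 in that order, for the optimal profit 127.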
COMPLEXITY ANALYSIS OF JS ALGORITHM
• Let n be the number of jobs and s be the number of jobs included in the solution.
• The loop between lines 4-15 (the for-loop) is iterated (n − 1) times.
• Each iteration takes O(k), where k is the number of existing jobs.
The time needed by the algorithm is O(sn); since s ≤ n, the worst-case time is O(n²).
If di = n − i + 1 for 1 ≤ i ≤ n, JS takes Θ(n²) time.
D and J need Θ(s) amount of space.