UNIT – II

Greedy Method
General method
Applications
Knapsack problem
Job sequencing with deadlines
Minimum cost spanning trees
Single source shortest path problem
Greedy Method
• Greedy is the most straightforward design technique.
• Most of the problems have n inputs and require us
to obtain a subset that satisfies some constraints.
• Any subset that satisfies these constraints is called
a feasible solution.
• We need to find a feasible solution that either
maximizes or minimizes the objective function.
• A feasible solution that does this is called an
optimal solution.
Greedy Method
• The greedy method is a simple strategy of
progressively building up a solution, one element
at a time, by choosing the best possible element at
each stage.
• At each stage, a decision is made regarding
whether or not a particular input is in an optimal
solution.
• This is done by considering the inputs in an order
determined by some selection procedure.
• If the inclusion of the next input into the partially
constructed optimal solution would result in an
infeasible solution, then this input is not added to
the partial solution.
Greedy Method
• The selection procedure itself is based on some
optimization measure.
• Several optimization measures are plausible for a
given problem.
• Most of them, however, will result in algorithms
that generate sub-optimal solutions.
• This version of the greedy technique is called the
subset paradigm.
• Problems such as the knapsack problem, job sequencing
with deadlines and minimum cost spanning trees
are based on the subset paradigm.
Greedy Method
Algorithm Greedy(a, n)
// a(1 : n) contains the n inputs
{
    solution := Ø;    // initialize the solution to empty
    for i := 1 to n do
    {
        x := select(a);
        if feasible(solution, x) then
            solution := Union(solution, x);
    }
    return solution;
}
Greedy Method
• The function select selects an input from “a”,
removes it and assigns its value to “x”.
• Feasible is a Boolean valued function, which
determines if “x” can be included into the
solution vector.
• The function Union combines “x” with
solution and updates the objective function.
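Below is a minimal Python sketch of this subset-paradigm template. The function name greedy and the select, feasible and union callables are illustrative assumptions standing in for the problem-specific routines described above; they are not part of the original pseudocode.

def greedy(a, select, feasible, union):
    # Build up the solution one element at a time, keeping it feasible.
    remaining = list(a)                 # the n inputs
    solution = []                       # initialize the solution to empty
    while remaining:
        x = select(remaining)           # pick the next input by the optimization measure
        remaining.remove(x)             # remove it from further consideration
        if feasible(solution, x):       # include x only if the partial solution stays feasible
            solution = union(solution, x)
    return solution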
Knapsack Problem
• The knapsack problem or rucksack (bag) problem is a problem
in combinatorial optimization:
• Given a set of items, each with a weight and a value, determine the
number of each item to include in a collection so that the total
weight is at most a given limit and the total value is
as large as possible.
Knapsack Problem
There are two main versions of the problem:
1. 0/1 knapsack problem
2. Fractional knapsack problem
Related variants include the bounded knapsack problem and
the unbounded knapsack problem.
Solutions to knapsack problems
➢ Brute-force approach: solve the problem with a straightforward
algorithm that tries all possibilities.
➢ Greedy algorithm: keep taking the most valuable remaining item,
for example the item with the largest value per unit size,
vi = valuei / sizei, until the maximum weight is reached.
➢ Dynamic programming: solve each subproblem once and store
its solution in an array.
Fractional Knapsack Problem
• We are given “n” objects and a knapsack.
• The object “i” has a weight wi and the knapsack
has a capacity “m”.
• If a fraction xi, 0 ≤ xi ≤ 1, of object i is placed into
the knapsack, then a profit of pi xi is earned.
• The objective is to fill the knapsack so that the total
profit earned is maximized.
• Since the knapsack capacity is “m”, we require
the total weight of all chosen objects to be at most
“m”.
Fractional Knapsack Problem
• The problem can be stated as follows:
• Maximize  ∑ pi xi,  1 ≤ i ≤ n
• Subject to  ∑ wi xi ≤ m
• where 0 ≤ xi ≤ 1 and 1 ≤ i ≤ n
Fractional Knapsack Problem
Algorithm fractional-knapsack (p, w, W)
for i = 1 to n
    do x[i] = 0
weight = 0
while (weight < W)
    do i = best remaining item        // the item with the largest p[i] / w[i] not yet taken
       if (weight + w[i] ≤ W)
           then x[i] = 1
                weight = weight + w[i]
       else x[i] = (W - weight) / w[i]
            weight = W
return x
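A runnable Python sketch of this greedy procedure is given below. The function name fractional_knapsack and the argument order (profits p, weights w, capacity W) are assumptions for illustration only.

def fractional_knapsack(p, w, W):
    # Greedy fractional knapsack: consider items in decreasing order of
    # profit per unit weight, take each item fully while it fits, and take
    # a fraction of the first item that does not fit.
    n = len(p)
    x = [0.0] * n                                   # solution vector, each x[i] in [0, 1]
    order = sorted(range(n), key=lambda i: p[i] / w[i], reverse=True)
    weight = 0
    for i in order:
        if weight >= W:                             # knapsack already full
            break
        if weight + w[i] <= W:                      # item i fits entirely
            x[i] = 1.0
            weight += w[i]
        else:                                       # take only the fraction that fits
            x[i] = (W - weight) / w[i]
            weight = W
    profit = sum(p[i] * x[i] for i in range(n))
    return x, profit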
Knapsack-Example

Example
Find out the optimal solution for the knapsack problem
using the following data. The number of items: n = 7, W = 15,
p(1:7) = (10, 5, 15, 7, 6, 18, 3) and
w(1:7) = (2, 3, 5, 7, 1, 4, 1).
Solution:
•The total number of Items n = 7
•The capacity of the Knapsack W = 15
•To find the solution for the given Knapsack problem
first we need to find out the value per weight for each
item.
Knapsack-Example
• The value per weight for the given 7 items
is shown in the table below.

Items 1 2 3 4 5 6 7
pi 10 5 15 7 6 18 3
wi 2 3 5 7 1 4 1
pi/wi 5 1.67 3 1 6 4.5 3

• Now sort the items in decreasing order according to their value per weight.
Knapsack-Example
Items 5 1 6 3 7 2 4
pi 6 10 18 15 3 5 7
wi 1 2 4 5 1 3 7
pi/wi 6 5 4.5 3 3 1.67 1

• Now the solution vector will be obtained using the algorithm fractional-knapsack.
Knapsack-Example
for i =1 to 7
do x[i] =0
x[1] x[2] x[3] x[4] x[5] x[6] x[7]
0 0 0 0 0 0 0

weight = 0
while (weight < W) while (0 < 15) True
do i = best remaining item i=5
if (weight + w[i] ≤ W) if (0 + 1 <= 15) True
then x[i] = 1 x [5] = 1
weight = weight + w[i] weight = 0 + 1 = 1
Knapsack-Example
x[1] x[2] x[3] x[4] x[5] x[6] x[7]
0 0 0 0 1 0 0

while (weight < W) while (1 < 15) True


do i = best remaining item i=1
if (weight + w[i] ≤ W) if (1 + 2 <= 15) True
then x[i] = 1 x [1] = 1
weight = weight + w[i] weight = 1 + 2 = 3
The updated solution vector will be
x[1] x[2] x[3] x[4] x[5] x[6] x[7]
1 0 0 0 1 0 0
Knapsack-Example
while (weight < W) while (3 < 15) True
do i = best remaining item i=6
if (weight + w[i] ≤ W) if (3 + 4 <= 15) True
then x[i] = 1 x [6] = 1
weight = weight + w[i] weight = 3 + 4 = 7

• The updated solution vector will be


x[1] x[2] x[3] x[4] x[5] x[6] x[7]
1 0 0 0 1 1 0
Knapsack-Example
while (weight < W) while (7 < 15) True
do i = best remaining item i=3
if (weight + w[i] ≤ W) if (7 + 5 <= 15) True
then x[i] = 1 x [3] = 1
weight = weight + w[i] weight = 7 + 5 = 12

• The updated solution vector will be


x[1] x[2] x[3] x[4] x[5] x[6] x[7]
1 0 1 0 1 1 0
Knapsack-Example
while (weight < W) while (12 < 15) True
do i = best remaining item i=7
if (weight + w[i] ≤ W) if (12 + 1 <= 15) True
then x[i] = 1 x [7] = 1
weight = weight + w[i] weight = 12 + 1 = 13

• The updated solution vector will be


x[1] x[2] x[3] x[4] x[5] x[6] x[7]
1 0 1 0 1 1 1
Knapsack-Example
while (weight < W) while (13 < 15) True
do i = best remaining item i=2
if (weight + w[i] ≤ W) if (13 + 3 <= 15) False
x[i] = (W - weight) / w[i] x [2] = (15 – 13) / 3 = 2/3
weight = W weight = 15

• The updated solution vector will be


x[1] x[2] x[3] x[4] x[5] x[6] x[7]
1 2/3 1 0 1 1 1
Knapsack-Example
while (weight < W) while (15 < 15) False
return x

• The solution vector is x = (1, 2/3, 1, 0, 1, 1, 1)
• The total weight is 2*1 + 3*(2/3) + 5*1 + 7*0 + 1*1 + 4*1 + 1*1 = 15
• The total profit is 10*1 + 5*(2/3) + 15*1 + 7*0 + 6*1 + 18*1 + 3*1
  = 10 + 3.33 + 15 + 0 + 6 + 18 + 3 = 55.33
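Assuming the fractional_knapsack sketch given after the algorithm, the same data reproduces this result:

x, profit = fractional_knapsack(p=[10, 5, 15, 7, 6, 18, 3],
                                w=[2, 3, 5, 7, 1, 4, 1], W=15)
# x == [1, 2/3, 1, 0, 1, 1, 1] and profit ≈ 55.33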
Knapsack Problem
• Example #1
– Find out the optimal solution for the Knapsack problem using
the following data. The number of items: n=3, W=20, (p1,p2,p3) =
(25,24,15), (w1,w2,w3) = (18,15,10).
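Assuming the same profit-per-weight greedy (the fractional_knapsack sketch above), the ratios are 25/18 ≈ 1.39, 24/15 = 1.6 and 15/10 = 1.5, so item 2 is taken fully and half of item 3 fills the remaining capacity:

x, profit = fractional_knapsack(p=[25, 24, 15], w=[18, 15, 10], W=20)
# x == [0, 1, 0.5] and profit == 31.5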
Knapsack Problem
– Time complexity
• Sorting: O(n log n) using a fast sorting algorithm such as merge sort
• GreedyKnapsack: O(n)
• So, total time is O(n log n)

• Proving technique
Compare the greedy solution with any optimal solution. If
the two solutions differ, find the first xi at which they differ.
Next, show that the xi in the optimal solution can be changed
to the value used by the greedy solution without any loss in total value.
Repeated use of this transformation shows that the greedy
solution is optimal.
Job Sequencing With Deadlines
The problem is stated as below.
• There are n jobs to be processed on a machine.
• Each job i has a deadline di ≥ 0 and a profit pi ≥ 0.
• The profit pi is earned if and only if the job is completed by its deadline.
• A job is completed if it is processed on a machine
for one unit of time.
• Only one machine is available for processing jobs.
• Only one job is processed at a time on the machine.

Job Sequencing With Deadlines
• A feasible solution is a subset of jobs J such
that each job in J can be completed by its deadline.
• An optimal solution is a feasible solution with
maximum profit value.
• The objective is to maximize the total profit ∑ pi, summed over all i ∈ J.
Job Sequencing With Deadlines-Example
• Let n = 4, profits (p1, p2, p3, p4) = (100, 10, 15, 27) and
deadlines (d1, d2, d3, d4) = (2, 1, 2, 1).
• Begin with J = Ø and profit ∑ pi = 0.
• The maximum deadline is 2, so only two time slots, [0-1] and [1-2],
are available on the machine, i.e. at most two jobs can be scheduled.

Job J      1    2   3   4
Deadline   2    1   2   1
Profit     100  10  15  27

S. No   Feasible solution J   Processing sequence   Value
1       (1)                   1                     100
2       (2)                   2                     10
3       (3)                   3                     15
4       (4)                   4                     27
5       (1, 2)                (2, 1)                10 + 100 = 110
6       (1, 3)                (1, 3) or (3, 1)      100 + 15 = 115
7       (1, 4)                (4, 1)                27 + 100 = 127
8       (2, 3)                (2, 3)                10 + 15 = 25
9       (3, 4)                (4, 3)                27 + 15 = 42

• J = (1, 4) is the optimal solution with value 27 + 100 = 127.
• Order of processing: job 4 is followed by job 1.
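The table can be checked with a small brute-force enumeration; this snippet is an illustration only and is not part of the slides. A subset is feasible if its jobs, taken in non-decreasing deadline order, each finish by their deadlines.

from itertools import combinations

p = {1: 100, 2: 10, 3: 15, 4: 27}
d = {1: 2, 2: 1, 3: 2, 4: 1}

def feasible(J):
    # schedule the jobs of J in non-decreasing deadline order in slots t = 1, 2, ...
    return all(t <= d[j] for t, j in enumerate(sorted(J, key=d.get), start=1))

best = max((J for r in range(1, 5) for J in combinations(p, r) if feasible(J)),
           key=lambda J: sum(p[j] for j in J))
# best == (1, 4) with total profit 127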


Job Sequencing With Deadlines

1. We must formulate an optimization measure to determine
how the next job is chosen.
2. Here, the optimization measure for selecting the next job
into the solution is the profit.
3. The next job to include is the one that increases ∑pi
the most, subject to the constraint that the resulting
J is a feasible solution.
4. Therefore the greedy strategy is to consider the
jobs in decreasing order of profit.
Job Sequencing With Deadlines
JS Procedure

1. Sort all jobs in decreasing order of profit.
2. Initialize the result set J with the first job in the sorted order,
placing it in the latest possible time slot, i.e. the slot ending at its deadline.
3. Consider the remaining n-1 jobs one by one and do the following.
4. If the slot at a job's deadline is already allotted to another job,
try the nearest earlier free slot; if no such slot exists, ignore the job.
5. If the current job fits into the current schedule without
missing its deadline, add it to the result; otherwise ignore
the current job.
A runnable sketch of this procedure is given below.
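The following Python sketch implements the slot-based procedure above. The function name job_sequencing and the 0-indexed jobs are assumptions for illustration.

def job_sequencing(p, d):
    # Greedy job sequencing with deadlines: p[i] is the profit and d[i] the
    # deadline of job i (jobs are 0-indexed, deadlines are positive integers).
    n = len(p)
    max_d = max(d)
    slot = [None] * (max_d + 1)                  # slot[t] holds the job finishing at time t
    for i in sorted(range(n), key=lambda i: p[i], reverse=True):
        # try the latest free slot no later than the job's deadline
        for t in range(min(d[i], max_d), 0, -1):
            if slot[t] is None:
                slot[t] = i
                break                            # job placed; otherwise it is rejected
    scheduled = [slot[t] for t in range(1, max_d + 1) if slot[t] is not None]
    profit = sum(p[i] for i in scheduled)
    return scheduled, profit                     # jobs in processing order, total profit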
Job Sequencing With Deadlines Example
Let n = 5, {p1, p2, p3, p4, p5} = {20, 15, 10, 5, 3} and {d1, d2, d3, d4, d5} = {2, 2, 1, 3, 3}.

Arrange the profits pi in descending order, along with their corresponding deadlines.

Jobs J     1   2   3   4   5
Profit     20  15  10  5   3
Deadline   2   2   1   3   3

The maximum deadline is 3, so only 3 slots are available on the machine, i.e. [0-1], [1-2], [2-3].

Slots (initially empty): [0-1] –, [1-2] –, [2-3] –
Job Sequencing With Deadlines Example
Let n = 5, {p1, p2, p3, p4, p5} = {20, 15, 10, 5, 3} and {d1, d2, d3, d4, d5} = {2, 2, 1, 3, 3}.

Solution (jobs considered in decreasing order of profit):

J            Assigned slots          Job considered   Action               Profit so far
Ø            none                    1                assign to [1-2]      0 (start)
{1}          [1-2]                   2                assign to [0-1]      20
{1, 2}       [0-1], [1-2]            3                cannot fit; reject   35
{1, 2}       [0-1], [1-2]            4                assign to [2-3]      35
{1, 2, 4}    [0-1], [1-2], [2-3]     5                reject               40

The optimal solution is J = {1, 2, 4} with profit 40.
Job Sequencing With Deadlines Example
Let n = 5, {p1, p2, p3, p4, p5} = {20, 15, 10, 5, 3} and {d1, d2, d3, d4, d5} = {2, 2, 1, 3, 3}.

Jobs J     1   2   3   4   5
Profit     20  15  10  5   3
Deadline   2   2   1   3   3

Step-1: Create an array J[] which stores the assigned jobs, one entry per time slot. Initially J[] will be

J[1]  J[2]  J[3]  J[4]  J[5]
0     0     0     0     0

Step-2: Add the i-th job to J[] at the index given by its deadline D[i]; here P1 = 20 and D1 = 2, so job 1 is placed in J[2], i.e. the (1,2) slot.

J[1]  J[2]  J[3]  J[4]  J[5]
0     P1    0     0     0

Step-3: Next consider P2 with deadline 2; J[2] is full, so check for an empty slot before J[2]. J[1] is empty, so insert P2 at the (0,1) slot.

J[1]  J[2]  J[3]  J[4]  J[5]
P2    P1    0     0     0
Job Sequencing With Deadlines Example
Let n = 5, {p1, p2, p3, p4, p5} = {20, 15, 10, 5, 3} and {d1, d2, d3, d4, d5} = {2, 2, 1, 3, 3}.

Step-4: P3 has deadline 1, but J[1] is full and there is no earlier slot. Discard P3.

Step-5: P4 has deadline 3 and J[3] is free. Insert P4 at the (2,3) slot.

J[1]  J[2]  J[3]  J[4]  J[5]
P2    P1    P4    0     0          J = {1, 2, 4}

Step-6: P5 has deadline 3, but J[3] and all earlier slots are occupied. Discard P5.

J[1]  J[2]  J[3]  J[4]  J[5]
P2    P1    P4    0     0

The sequence giving the optimal solution is P2 - P1 - P4 with profit 15 + 20 + 5 = 40.
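Assuming the job_sequencing sketch given earlier (0-indexed jobs), the same data reproduces this schedule:

order, profit = job_sequencing(p=[20, 15, 10, 5, 3], d=[2, 2, 1, 3, 3])
# order == [1, 0, 3], i.e. jobs 2, 1, 4 in the slides' 1-based numbering; profit == 40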


What is the solution generated by the function JS when n = 7,
P [1 : 7 ] =(3,5,20,18,1,6,30) and D[1:7 ] = (1,3,4,3,2,1,2)
Arrange the profits pi in descending order, along with their corresponding deadlines.

Jobs J     1   2   3   4   5   6   7
Profit     3   5   20  18  1   6   30
Deadline   1   3   4   3   2   1   2

The maximum deadline is 4, so only 4 slots are available on the machine, i.e. [0-1], [1-2], [2-3], [3-4].

After sorting by profit:

Jobs J     7   3   4   6   2   1   5
Profit     30  20  18  6   5   3   1
Deadline   2   4   3   1   3   1   2

Step-1: Start with J = Ø and profit ∑ pi = 0.


Step 2: Job 7 is added to J as it has the largest profit; J = {7} is feasible, with job 7 in the (1,2) slot.

Slots: [0-1] –, [1-2] J7, [2-3] –, [3-4] –

Step 3: Now job 3 is considered. The solution J = {3, 7} is feasible, with processing sequence (7, 3); job 3 goes in the (3,4) slot.

Slots: [0-1] –, [1-2] J7, [2-3] –, [3-4] J3

Step 4: The next job, job 4, is considered. The solution J = {3, 4, 7} is feasible, with processing sequence (7, 4, 3); job 4 goes in the (2,3) slot.

Slots: [0-1] –, [1-2] J7, [2-3] J4, [3-4] J3

Step 5: Finally, job 6 is considered. The solution J = {3, 4, 6, 7} is feasible, with processing sequence (6, 7, 4, 3); job 6 goes in the (0,1) slot. The remaining jobs 2, 1 and 5 cannot be fitted and are rejected.

Slots: [0-1] J6, [1-2] J7, [2-3] J4, [3-4] J3

J = {3, 4, 6, 7} is the optimal solution with value 30 + 20 + 18 + 6 = 74.
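The job_sequencing sketch given earlier confirms this result on the same data (again assuming 0-indexed jobs):

order, profit = job_sequencing(p=[3, 5, 20, 18, 1, 6, 30], d=[1, 3, 4, 3, 2, 1, 2])
# order == [5, 6, 3, 2], i.e. jobs 6, 7, 4, 3 in 1-based numbering; profit == 74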


Let n = 5 and the jobs be {a, b, c, d, e} with P[1:5] = (100, 19, 27, 25, 15) and D[1:5] = (2, 1, 2, 1, 3).

Jobs J     a    b   c   d   e
Profit     100  19  27  25  15
Deadline   2    1   2   1   3

The maximum deadline is 3, so only 3 slots are available on the machine, i.e. [0-1], [1-2], [2-3].

After sorting by profit:

Jobs J     a    c   d   b   e
Profit     100  27  25  19  15
Deadline   2    2   1   1   3

Step-1: Start with J = Ø and profit ∑ pi = 0.


Step 2: Job 'a' is added to J as it has the largest profit; J = {a} is feasible, with 'a' in the (1,2) slot.

Slots: [0-1] –, [1-2] a, [2-3] –

Step 3: Now job 'c' is considered. The solution J = {a, c} is feasible, with 'c' in the (0,1) slot.

Slots: [0-1] c, [1-2] a, [2-3] –

Step 4: The next job, 'd' (deadline 1), is considered. No suitable slot is available, so it is rejected.

Step 5: The next job, 'b' (deadline 1), is considered. No suitable slot is available, so it is rejected.

Step 6: Finally, job 'e' is considered. The solution J = {a, c, e} is feasible, with processing sequence (c, a, e); 'e' goes in the (2,3) slot.

Slots: [0-1] c, [1-2] a, [2-3] e

J = {a, c, e} is the optimal solution with value 100 + 27 + 15 = 142.


Difference Between Divide and
Conquer and Greedy Method
