
CHAPTER 2

Greedy Method: General method, knapsack problem, job sequencing with deadlines, minimum spanning trees, single source shortest paths, and analysis of these problems.

TOPIC: GREEDY APPROACH

The greedy method is an algorithmic approach used to solve optimization problems. It builds a solution incrementally by making a sequence of choices, each of which is locally optimal. The hope is that these local choices lead to a globally optimal solution. This approach is simple and efficient for many problems, but it does not always guarantee an optimal solution.

Characteristics of the Greedy Method:

1. Greedy Choice Property: A globally optimal solution can be arrived at by making locally optimal choices.
2. Optimal Substructure: A problem has an optimal substructure if an optimal solution can be constructed efficiently from optimal solutions of its subproblems.

Steps in the Greedy Algorithm:

1. Initialization: Start with an empty or initial solution.
2. Selection: Choose the best option available at each step according to a specific criterion (local optimization).
3. Feasibility Check: Ensure the chosen option is feasible (e.g., satisfies constraints).
4. Incorporation: Add the chosen option to the current solution.
5. Repeat: Continue this process until the entire problem is solved.

Examples of Problems Solved Using Greedy Algorithms:

1. Activity Selection Problem:
o Select the maximum number of activities that don't overlap in time.
o Greedy criterion: Choose the activity that ends the earliest.
2. Knapsack Problem (Fractional):
o Fill a knapsack with items to maximize value without exceeding weight capacity.
o Greedy criterion: Choose items with the highest value-to-weight ratio.
3. Huffman Coding:
o Used in data compression.
o Greedy criterion: Build a prefix tree by repeatedly merging the two smallest-frequency nodes.
4. Minimum Spanning Tree (MST):
o Algorithms like Kruskal’s and Prim’s use a greedy approach to find the MST of a graph.
5. Dijkstra’s Algorithm:
o Finds the shortest path in a weighted graph.
o Greedy criterion: Choose the vertex with the smallest known distance.
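As a small sketch of the first example above, the earliest-finish-time rule for activity selection can be written as a short function (the (start, finish) pair format is an assumed representation for illustration):

```python
# Greedy activity selection: sort activities by finish time and take
# each one whose start is no earlier than the last chosen finish.
def select_activities(activities):
    """activities: list of (start, finish) pairs.
    Returns a maximum-size set of non-overlapping activities."""
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:
            chosen.append((start, finish))
            last_finish = finish
    return chosen
```

Sorting by finish time is what makes each local choice safe: taking the activity that frees the resource earliest leaves the most room for the remaining activities.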

Advantages:

• Efficiency: Greedy algorithms are often faster and simpler compared to other methods like dynamic programming.
• Practicality: They are easy to implement for many real-world problems.

Limitations:

• Non-optimal Solutions: Not all problems are amenable to greedy strategies (e.g., the 0/1 Knapsack Problem).
• Problem-Specific: Requires the problem to have the greedy-choice property and optimal substructure.

TOPIC: KNAPSACK PROBLEM

The Knapsack Problem is a classic optimization problem in which the goal is to maximize
the value of items placed in a knapsack while adhering to its weight capacity. There are two
primary variations of the problem:

1. Fractional Knapsack Problem (solvable using a greedy algorithm)
2. 0/1 Knapsack Problem (solvable using dynamic programming)

1. Fractional Knapsack Problem

In this variation, you can take fractions of items, meaning you can divide an item to fit it into
the knapsack.

Problem Statement:

• You are given n items, each with a weight wi and a value vi.
• The knapsack has a weight capacity W.
• Maximize the total value of items in the knapsack while not exceeding the capacity W.

Approach:

A greedy algorithm is used:

1. Calculate the value-to-weight ratio for each item: ratio = vi/wi.
2. Sort the items in decreasing order of their value-to-weight ratio.
3. Select items one by one in the sorted order:
o If the current item's weight fits in the knapsack, take it entirely.
o If the current item's weight exceeds the remaining capacity, take the fraction of the item that fits.
4. Stop when the knapsack is full.

Time Complexity:

• Sorting items: O(n log n)
• Selection of items: O(n)
• Total: O(n log n)
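The steps above can be sketched as follows; the (value, weight) pair format is an assumed representation for illustration:

```python
def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs; capacity: knapsack limit W.
    Greedy: take items in decreasing value-to-weight ratio, splitting
    the last item if it does not fit entirely. Returns the max value."""
    total = 0.0
    remaining = capacity
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if weight <= remaining:
            total += value          # whole item fits
            remaining -= weight
        else:
            # only a fraction fits; take that fraction and stop
            total += value * remaining / weight
            break
    return total
```

For example, with items (60, 10), (100, 20), (120, 30) and capacity 50, the first two items are taken whole and two-thirds of the third is taken, for a total value of 240.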

NUMERICAL

TOPIC: JOB SEQUENCING WITH DEADLINES

Given an array of jobs where every job has a deadline and an associated profit if the job is finished before the deadline. It is also given that every job takes a single unit of time, so the minimum possible deadline for any job is 1. The goal is to maximize the total profit when only one job can be scheduled at a time.
ALGO:
1) Sort all jobs in decreasing order of profit.
2) Initialize the result sequence with the first job in the sorted order.
3) Do the following for the remaining n-1 jobs:
   a) If the current job can fit in the current result sequence without missing the deadline, add the current job to the result. Else, ignore the current job.
Example
Let us consider a set of given jobs as shown in the following table. We have to find a
sequence of jobs, which will be completed within their deadlines and will give maximum
profit. Each job is associated with a deadline and profit.

Job J1 J2 J3 J4 J5

Deadline 2 1 3 2 1

Profit 60 100 20 40 20

Solution
To solve this problem, the given jobs are sorted according to their profit in a descending
order. Hence, after sorting, the jobs are ordered as shown in the following table.

Job J2 J1 J4 J3 J5

Deadline 1 2 2 3 1

Profit 100 60 40 20 20

From this set of jobs, first we select J2, as it can be completed within its deadline and contributes the maximum profit.

• Next, J1 is selected as it gives more profit compared to J4.
• In the next time slot, J4 cannot be selected as its deadline is over; hence J3 is selected, as it executes within its deadline.
• The job J5 is discarded as it cannot be executed within its deadline.

Thus, the solution is the sequence of jobs (J2, J1, J3), which can be executed within their deadlines and gives the maximum profit.

Total profit of this sequence is 100 + 60 + 20 = 180.

Analysis
In this algorithm, we use two nested loops. Hence, the complexity of this algorithm is O(n^2).
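A minimal sketch of this algorithm, assuming jobs are given as (id, deadline, profit) triples; placing each job in the latest free slot on or before its deadline is a common way to implement step 3(a):

```python
def job_sequencing(jobs):
    """jobs: list of (job_id, deadline, profit); each job takes one
    time unit. Greedy: sort by profit (descending) and put each job
    in the latest free slot on or before its deadline.
    Returns the scheduled job ids in execution order."""
    jobs = sorted(jobs, key=lambda j: j[2], reverse=True)
    max_deadline = max(d for _, d, _ in jobs)
    slots = [None] * max_deadline  # slots[t] = job run in time unit t+1
    for job_id, deadline, profit in jobs:
        # try the latest slot first so earlier slots stay free
        for t in range(min(deadline, max_deadline) - 1, -1, -1):
            if slots[t] is None:
                slots[t] = job_id
                break
    return [j for j in slots if j is not None]
```

On the table above (J1..J5 with deadlines 2, 1, 3, 2, 1 and profits 60, 100, 20, 40, 20), this returns the sequence (J2, J1, J3) with total profit 180, matching the worked solution.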

TOPIC: MINIMUM SPANNING TREE

The Minimum Spanning Tree (MST) is a subgraph of a weighted undirected graph that connects all the vertices with the minimum possible total edge weight, without forming any cycles.

Characteristics of MST:
1. Spanning Tree: It is a tree that includes all the vertices of the graph with
the minimum number of edges (V - 1, where V is the number of
vertices).
2. Weight Minimization: The total weight of the edges in the MST is
minimized.
3. No Cycles: An MST is a tree and cannot contain any cycles.

SPANNING TREE

• A spanning tree is a subgraph of an undirected connected graph which includes all the vertices of the graph with the minimum possible number of edges. If a vertex is missed, then it is not a spanning tree.
• The edges may or may not have weights assigned to them.
• The total number of spanning trees that can be created from a complete graph with n vertices is equal to n^(n-2).
• If we have n = 4, the maximum number of possible spanning trees is equal to 4^(4-2) = 16. Thus, 16 spanning trees can be formed from a complete graph with 4 vertices.
Example of a Spanning Tree

Let's understand the spanning tree with examples below:

Let the original graph be: (figure: the original graph)

Some of the possible spanning trees that can be created from the above graph are: (figures: several possible spanning trees)


Minimum Spanning Tree

A minimum spanning tree is a spanning tree in which the sum of the weight of the edges is as
minimum as possible.

Example of a Minimum Spanning Tree

Let's understand the above definition with the help of the example below.

The initial graph is: (figure: a weighted graph)

The possible spanning trees from the above graph are: (figures: four candidate spanning trees)

The minimum spanning tree from the above spanning trees is: (figure: the minimum spanning tree)


MST Algorithms:

Two commonly used algorithms to find an MST are:

1. Kruskal’s Algorithm (Greedy Approach):

• Sort all edges in non-decreasing order of their weights.
• Use a union-find data structure to add edges to the MST while ensuring no cycles are formed.

Steps:

1. Sort all edges by weight.
2. Initialize MST as an empty set.
3. For each edge in sorted order:
o If adding the edge does not form a cycle, add it to the MST.
4. Stop when the MST contains V−1 edges.

Below is the illustration of the above approach:

Input Graph:
The graph contains 9 vertices and 14 edges. So, the minimum spanning tree formed will have (9 – 1) = 8 edges.
After sorting:

Weight Source Destination

1 7 6

2 8 2

2 6 5

4 0 1

4 2 5

6 8 6

7 2 3

7 7 8

8 0 7

8 1 2

9 3 4

10 5 4

11 1 7

14 3 5

Now pick all edges one by one from the sorted list of edges

Step 1: Pick edge 7-6. No cycle is formed, include it.


Add edge 7-6 in the MST

Step 2: Pick edge 8-2. No cycle is formed, include it.

Add edge 8-2 in the MST

Step 3: Pick edge 6-5. No cycle is formed, include it.

Add edge 6-5 in the MST


Step 4: Pick edge 0-1. No cycle is formed, include it.

Add edge 0-1 in the MST

Step 5: Pick edge 2-5. No cycle is formed, include it.

Add edge 2-5 in the MST

Step 6: Pick edge 8-6. Since including this edge results in a cycle, discard it. Pick edge 2-3: No cycle is formed, include it.
Add edge 2-3 in the MST

Step 7: Pick edge 7-8. Since including this edge results in a cycle, discard it. Pick edge 0-7. No cycle is formed, include it.

Add edge 0-7 in MST

Step 8: Pick edge 1-2. Since including this edge results in a cycle, discard it. Pick edge 3-4. No cycle is formed, include it.

Add edge 3-4 in the MST

Note: Since the number of edges included in the MST equals (V – 1), the algorithm stops here.
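The walkthrough above can be sketched with a small union-find structure. The (weight, source, destination) edge format mirrors the sorted table; note that ties between equal-weight edges may be broken in a different order than in the illustration, but the resulting total MST weight is the same:

```python
def kruskal(num_vertices, edges):
    """edges: list of (weight, u, v) tuples for an undirected graph.
    Returns the list of MST edges chosen in Kruskal's order."""
    parent = list(range(num_vertices))

    def find(x):
        # find the component root, compressing the path as we go
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    for weight, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:               # different components: no cycle
            parent[ru] = rv        # union the two components
            mst.append((weight, u, v))
        if len(mst) == num_vertices - 1:
            break                  # MST complete with V-1 edges
    return mst
```

Running this on the 9-vertex, 14-edge graph from the table yields 8 edges with total weight 37, the same tree found step by step above.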

2. Prim’s Algorithm (Greedy Approach):

• Start from an arbitrary vertex and grow the MST one edge at a time by adding the smallest edge that connects a vertex in the MST to a vertex outside it.

Steps:
1. Initialize all vertices as not visited.
2. Choose any vertex as the starting point and mark it as visited.
3. Add the smallest edge that connects a visited vertex to an unvisited vertex.
4. Repeat until all vertices are included in the MST.
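The steps above can be sketched with a min-heap; the adjacency-list format (vertex mapped to a list of (weight, neighbor) pairs) is an assumed representation for illustration:

```python
import heapq

def prim(adj, start=0):
    """adj: dict mapping vertex -> list of (weight, neighbor) pairs
    for a connected undirected graph. Returns the total MST weight.
    Grows the tree from `start`, always taking the cheapest edge
    that leaves the visited set."""
    visited = {start}
    heap = list(adj[start])        # candidate edges out of the tree
    heapq.heapify(heap)
    total = 0
    while heap and len(visited) < len(adj):
        weight, v = heapq.heappop(heap)
        if v in visited:
            continue               # edge leads back into the tree
        visited.add(v)
        total += weight
        for edge in adj[v]:
            if edge[1] not in visited:
                heapq.heappush(heap, edge)
    return total
```

The heap plays the role of step 3: it always surfaces the smallest edge connecting a visited vertex to an unvisited one.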

TOPIC: SINGLE SOURCE SHORTEST PATH

The Single Source Shortest Path (SSSP) problem aims to find the shortest paths from a
given source vertex to all other vertices in a weighted graph. The graph can be directed or
undirected.
Common Algorithms for SSSP:

1. Dijkstra's Algorithm (for graphs with non-negative weights)
2. Bellman-Ford Algorithm (handles graphs with negative weights)
3. Floyd-Warshall Algorithm (all-pairs shortest paths, but can solve SSSP for dense graphs)

1. Dijkstra’s Algorithm

• Type: Greedy
• Input: Weighted graph with non-negative weights.
• Output: Shortest distance from the source to all other vertices.

Steps:

1. Initialize the distance of the source vertex to itself as 0 and all others as infinity (∞).
2. Use a priority queue (min-heap) to keep track of the vertices with the smallest known
distance.
3. Extract the vertex with the smallest distance, mark it as visited, and update the distances of
its neighbors.
4. Repeat until all vertices are processed or the queue is empty.

Time Complexity:

• Using a priority queue (min-heap): O((V + E) log V)
• V: Number of vertices, E: Number of edges.
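The steps above can be sketched with Python's heapq module; the adjacency-list format and the vertex labels are assumptions for illustration:

```python
import heapq

def dijkstra(adj, source):
    """adj: dict mapping vertex -> list of (neighbor, weight) pairs,
    with non-negative weights. Returns a dict of shortest distances
    from source to every vertex."""
    dist = {v: float("inf") for v in adj}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue               # stale entry: u was relaxed further
        for v, w in adj[u]:
            if d + w < dist[v]:    # relaxation step
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist
```

Instead of deleting outdated queue entries, this common variant pushes a new entry on every improvement and skips stale ones on extraction, which keeps the implementation short.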

Here are 5 key points about Dijkstra's Algorithm:

1. Purpose

• Dijkstra's Algorithm is used to find the shortest path from a single source vertex to all other vertices in a graph.
• It works for graphs with non-negative edge weights only.

2. Approach

• It is a greedy algorithm, which means it selects the shortest (locally optimal) path at each step with the hope of finding the globally optimal solution.

3. Steps

• Initialization: Start with the source vertex, set its distance to 0, and all others to ∞ (infinity).
• Priority Queue: Use a min-heap to efficiently get the next vertex with the smallest known distance.
• Relaxation: For each vertex, update the distances of its neighbors if a shorter path is found through the current vertex.

4. Time Complexity

• Using a priority queue (min-heap): O((V + E) log V), where V is the number of vertices and E is the number of edges.
• Suitable for sparse graphs with a large number of vertices.

5. Limitations

• Does not work with graphs containing negative weight edges, as the greedy approach cannot guarantee correct results.
• For graphs with negative weights, the Bellman-Ford Algorithm is used instead.
COMPLEXITY CHART
