
ADA 150 Marks Answers with Diagrams and Examples

1. Heapsort
Introduction
Heapsort is a comparison-based sorting technique based on a Binary Heap data structure. It's an in-place sorting
algorithm with a time complexity of O(n log n) for all cases.

Algorithm
1. Build a max heap from the input data.
2. At this point, the largest item is stored at the root of the heap.
3. Swap it with the last item of the heap and reduce the heap size by one.
4. Heapify the root.
5. Repeat steps 3-4 until the heap is empty.

Pseudocode
HEAPSORT(arr):
    BUILD_MAX_HEAP(arr)
    for i = n-1 down to 1:
        swap arr[0] and arr[i]
        heap_size = heap_size - 1
        MAX_HEAPIFY(arr, 0)
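The pseudocode maps onto the following runnable Python sketch. This is an illustrative translation of mine, not part of the original answer; the helper names max_heapify and build_max_heap are my own.

def max_heapify(arr, heap_size, i):
    """Sift arr[i] down until the subtree rooted at i satisfies the max-heap property."""
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < heap_size and arr[left] > arr[largest]:
        largest = left
    if right < heap_size and arr[right] > arr[largest]:
        largest = right
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        max_heapify(arr, heap_size, largest)

def build_max_heap(arr):
    # The last internal node sits at index n//2 - 1; heapify bottom-up from there.
    for i in range(len(arr) // 2 - 1, -1, -1):
        max_heapify(arr, len(arr), i)

def heapsort(arr):
    build_max_heap(arr)
    for i in range(len(arr) - 1, 0, -1):
        arr[0], arr[i] = arr[i], arr[0]   # move the current maximum to the end
        max_heapify(arr, i, 0)            # heap now occupies arr[0..i-1]
    return arr

print(heapsort([4, 10, 3, 5, 1]))  # [1, 3, 4, 5, 10]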

Dry Run Example

Input: [4, 10, 3, 5, 1]
Build max heap: [10, 5, 3, 4, 1]
Swap root and last: [1, 5, 3, 4, 10] → heapify → [5, 4, 3, 1, 10] → ...
Sorted: [1, 3, 4, 5, 10]

Diagram
Initial Binary Heap:

        10
       /  \
      5    3
     / \
    4   1

Time & Space Complexity
● Best, Worst, Average: O(n log n)
● Space: O(1)

2. Hash Tables
Introduction
Hash tables store key-value pairs using a hash function to compute an index into an array of buckets.

Operations
● Insert
● Delete
● Search

Pseudocode
INSERT(key, value):
    index = hash(key)
    table[index] = value
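The pseudocode above ignores collisions. Below is a small runnable Python sketch of my own that resolves collisions by separate chaining (one common choice); the class and method names are illustrative, not taken from the original.

class HashTable:
    def __init__(self, size=8):
        self.size = size
        self.buckets = [[] for _ in range(size)]  # one chain per bucket

    def _index(self, key):
        return hash(key) % self.size

    def insert(self, key, value):
        bucket = self.buckets[self._index(key)]
        for i, (k, _) in enumerate(bucket):
            if k == key:                  # key already present: overwrite
                bucket[i] = (key, value)
                return
        bucket.append((key, value))

    def search(self, key):
        for k, v in self.buckets[self._index(key)]:
            if k == key:
                return v
        return None

    def delete(self, key):
        idx = self._index(key)
        self.buckets[idx] = [(k, v) for k, v in self.buckets[idx] if k != key]

t = HashTable()
t.insert("cat", 1)
print(t.search("cat"))  # 1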

Dry Run Example


Keys: ["cat", "bat", "rat"] → hash("cat") = 2 → store at index 2.

Diagram
Index | Value
------|------
  0   | None
  1   | rat
  2   | cat
  3   | bat

Time & Space Complexity
● Average: O(1) for insert/search/delete
● Worst: O(n) if many collisions occur

3. Red-Black Trees
Introduction
A Red-Black Tree is a self-balancing binary search tree where each node has an extra bit for color (red or black).

Properties
1. Every node is red or black.
2. The root is black.
3. Every leaf (NIL node) is black.
4. Red nodes cannot be adjacent (a red node has no red child).
5. Every path from a node to its descendant NIL nodes contains the same number of black nodes.

Pseudocode (Insert)
RB_INSERT(tree, node):
    insert node as in a normal BST and color it red
    fix any violations using rotations and recoloring
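Rotations are the primitive operation behind the fix-up step. Below is a minimal Python sketch of a left rotation, assuming hypothetical Node and RBTree classes of my own with left, right, parent, and root fields:

class Node:
    def __init__(self, key, color="R"):
        self.key, self.color = key, color
        self.left = self.right = self.parent = None

class RBTree:
    def __init__(self):
        self.root = None

def left_rotate(tree, x):
    """Rotate the edge between x and its right child y to the left."""
    y = x.right
    x.right = y.left              # y's left subtree becomes x's right subtree
    if y.left is not None:
        y.left.parent = x
    y.parent = x.parent           # attach y to x's old parent
    if x.parent is None:
        tree.root = y             # x was the root
    elif x is x.parent.left:
        x.parent.left = y
    else:
        x.parent.right = y
    y.left = x                    # x becomes y's left child
    x.parent = y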

Diagram
Before Insert:
      10(B)
     /    \
   5(R)   15(R)

After Insert(1) and fix-up (5 and 15 are recolored black; the root is briefly recolored red during fix-up but is always forced back to black):
      10(B)
     /    \
   5(B)   15(B)
   /
 1(R)

Time & Space Complexity
● O(log n) for insert, delete, and search

4. Greedy Method
General Strategy
Construct a solution piece by piece, making the locally optimal choice at each step in the hope that these choices lead to a globally optimal solution.

A. Knapsack Problem (Fractional)

Problem

Given the weights and values of items, choose items (or fractions of items) to maximize the total value carried in a knapsack of capacity W.

Algorithm
1. Calculate the value/weight ratio of each item.
2. Sort the items by ratio in decreasing order.
3. Pick items greedily in that order, taking a fraction of the last item if it does not fit entirely.

Pseudocode
KNAPSACK(w[], v[], W):
    calculate v/w for each item
    sort items by v/w in descending order
    for each item:
        if weight ≤ remaining capacity:
            take the whole item
        else:
            take the fraction that fits and stop
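A runnable Python sketch of the same greedy procedure follows; it is my own illustration and assumes the items are given as (value, weight) pairs.

def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs. Returns the maximum total value."""
    # Sort by value/weight ratio, highest ratio first.
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for value, weight in items:
        if capacity <= 0:
            break
        if weight <= capacity:
            total += value                         # take the whole item
            capacity -= weight
        else:
            total += value * capacity / weight     # take only the fraction that fits
            capacity = 0
    return total

print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # 240.0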
Dry Run
Items (value, weight): [(60, 10), (100, 20), (120, 30)]
Capacity: 50
Take 60 + 100 + (2/3) × 120 = 240

Diagram
Value/weight ratios: 60/10 = 6, 100/20 = 5, 120/30 = 4
Sorted order of ratios: 6, 5, 4

Time Complexity
●​ O(n log n)

B. Minimum Spanning Tree (MST)

Prim’s Algorithm

1. Start with any vertex.
2. Repeatedly add the lowest-weight edge that connects a visited vertex to an unvisited one, until all vertices are included.

Pseudocode
PRIM(graph):
    key[v] = ∞ for every vertex v
    key[0] = 0
    for i in range(V):
        u = vertex with minimum key not yet in the MST
        add u to the MST
        for each neighbor v of u:
            if weight(u, v) < key[v]:
                key[v] = weight(u, v)
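A runnable Python sketch of Prim's algorithm using a binary heap of candidate edges; this is my own illustration and assumes the graph is an adjacency list mapping each vertex to (weight, neighbor) pairs.

import heapq

def prim(adj, start):
    """adj: {u: [(weight, v), ...]}. Returns (total_weight, mst_edges)."""
    visited = {start}
    edges = [(w, start, v) for w, v in adj[start]]   # edges leaving the visited set
    heapq.heapify(edges)
    total, mst = 0, []
    while edges and len(visited) < len(adj):
        w, u, v = heapq.heappop(edges)
        if v in visited:
            continue                      # edge leads back into the tree; skip it
        visited.add(v)
        total += w
        mst.append((u, v, w))
        for w2, nxt in adj[v]:
            if nxt not in visited:
                heapq.heappush(edges, (w2, v, nxt))
    return total, mst

graph = {
    'A': [(2, 'B'), (3, 'C')],
    'B': [(2, 'A'), (1, 'C')],
    'C': [(3, 'A'), (1, 'B')],
}
print(prim(graph, 'A'))  # (3, [('A', 'B', 2), ('B', 'C', 1)])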

Diagram
Graph:
    A --2-- B
     \     /
      3   1
       \ /
        C

MST: edges A-B (2) and B-C (1), total weight 3

Time Complexity: O(V^2) with an adjacency matrix, or O(E log V) with a binary heap

C. Single Source Shortest Path

Dijkstra’s Algorithm

1. Initialize all distances to ∞ and set dist[src] = 0.
2. Repeatedly pick the unvisited node with the minimum distance.
3. Relax (update) the distances of its neighbors.

Pseudocode
DIJKSTRA(graph, src):
    dist[v] = ∞ for every vertex v
    dist[src] = 0
    for each node:
        u = unvisited node with minimum dist
        mark u as visited
        for each neighbor v of u:
            if dist[u] + weight(u, v) < dist[v]:
                dist[v] = dist[u] + weight(u, v)
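A runnable Python sketch using a priority queue; again my own illustration, with the same adjacency-list format as the Prim example above.

import heapq

def dijkstra(adj, src):
    """adj: {u: [(weight, v), ...]}. Returns a dict of shortest distances from src."""
    dist = {v: float('inf') for v in adj}
    dist[src] = 0
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                      # stale entry; a shorter path was already found
        for w, v in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w           # relax the edge (u, v)
                heapq.heappush(heap, (dist[v], v))
    return dist

graph = {
    'A': [(1, 'B')],
    'B': [(1, 'A'), (2, 'C')],
    'C': [(2, 'B')],
}
print(dijkstra(graph, 'A'))  # {'A': 0, 'B': 1, 'C': 3}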

Diagram
Graph:  A --1-- B --2-- C
Shortest path from A to C: A → B → C, distance 3

Time Complexity
● O(V^2) with an array, or O(E log V) with a binary heap


5. Backtracking
Introduction
Backtracking is a general algorithmic technique for solving computational problems by incrementally building candidate solutions and abandoning a candidate ("backtracking") as soon as it is determined that it cannot be extended to a valid solution.

General Method
Backtracking is a depth-first search strategy: we build the solution step by step, and if we reach a point where we cannot proceed further, we backtrack to the previous step and try a different possibility.

1. Choose: select a choice from the available options.
2. Explore: move forward by including this choice.
3. Check feasibility: if the current path is valid, explore further; if not, backtrack.
4. Un-choose: remove the last chosen option and try another possibility.

Algorithm
1. Define a recursive function.
2. Start by trying the first available choice.
3. If the choice leads to a solution, return it.
4. If not, undo the last choice and try the next possibility.

Pseudocode
BACKTRACKING(solution):
    if solution is complete:
        return solution
    for each candidate in available choices:
        if candidate is valid:
            add candidate to solution
            result = BACKTRACKING(solution)
            if result is valid:
                return result
            remove candidate from solution
    return no solution
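As a concrete illustration of this template (my own example, not part of the original answer), the short Python sketch below enumerates all permutations of a list using the choose / explore / un-choose pattern:

def permutations(items):
    """Enumerate all permutations of items with the backtracking template."""
    result, current = [], []
    used = [False] * len(items)

    def backtrack():
        if len(current) == len(items):        # solution is complete
            result.append(current[:])
            return
        for i, item in enumerate(items):      # candidates
            if used[i]:
                continue                       # not a valid candidate here
            used[i] = True                     # choose
            current.append(item)
            backtrack()                        # explore
            current.pop()                      # un-choose (backtrack)
            used[i] = False

    backtrack()
    return result

print(permutations([1, 2, 3]))  # all 6 permutations of [1, 2, 3]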

Time & Space Complexity
● Time complexity: varies by problem, but is typically exponential in the input size.
● Space complexity: O(N) for the recursion stack.

6. 8 Queens Problem
Problem Statement
Place 8 queens on a chessboard such that no two queens can attack each other. This means no two queens should
share the same row, column, or diagonal.

Approach
Backtracking is used to place queens row by row. In each row, we try to place the queen in a column where it is not
attacked by other queens.

Pseudocode
PLACE_QUEENS(board, row):
    if row == 8:
        print(board)
        return
    for col in range(8):
        if is_safe(board, row, col):
            board[row][col] = 1
            PLACE_QUEENS(board, row + 1)
            board[row][col] = 0
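A runnable Python version with a concrete is_safe check; this is my own illustration and stores one queen per row as queens[row] = column.

def solve_n_queens(n=8):
    """Return one valid placement as a list queens[row] = column, or None."""
    queens = [-1] * n

    def is_safe(row, col):
        for r in range(row):
            c = queens[r]
            # Conflict if same column, or same diagonal (|Δrow| == |Δcol|).
            if c == col or abs(c - col) == abs(r - row):
                return False
        return True

    def place(row):
        if row == n:
            return True                   # all queens placed
        for col in range(n):
            if is_safe(row, col):
                queens[row] = col         # choose
                if place(row + 1):        # explore
                    return True
                queens[row] = -1          # un-choose
        return False

    return queens if place(0) else None

print(solve_n_queens())  # one valid solution, e.g. [0, 4, 7, 5, 2, 6, 1, 3]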

Dry Run Example

● Initially, an empty 8x8 board is set up.
● In each row, we try placing a queen in each column, checking that no previously placed queen can attack it.
● Once a solution is found, it is printed.
● If no valid placement exists in a row, the process backtracks to the previous row.

Diagram
Step-by-step placements:
Row 0 → [0, 1, 0, 0, 0, 0, 0, 0]
Row 1 → [0, 0, 0, 1, 0, 0, 0, 0]
...
Solution (One of the possible outputs):
[1, 0, 0, 0, 0, 0, 0, 0]
[0, 0, 0, 0, 1, 0, 0, 0]
...

Time & Space Complexity
● Time complexity: O(N!) in the worst case: the first queen has N candidate columns, the next at most N-1 non-conflicting ones, and so on.
● Space complexity: O(N) for the recursion stack.

7. Graph Coloring
Problem Statement
Graph coloring is the assignment of labels (colors) to the vertices of a graph such that no two adjacent vertices share
the same color.

Approach
We assign colors to each vertex one by one. If the assigned color is not valid (i.e., adjacent vertices have the same
color), we backtrack and try another color.

Pseudocode
GRAPH_COLORING(graph, colors, vertex):
    if all vertices are colored:
        return True
    for color in available_colors:
        if is_valid(graph, vertex, color):
            colors[vertex] = color
            if GRAPH_COLORING(graph, colors, vertex + 1):
                return True
            colors[vertex] = None
    return False
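A runnable Python sketch of my own; it assumes the graph is given as an adjacency list and uses the 4-vertex path from the dry run below.

def graph_coloring(adj, num_colors):
    """adj: adjacency list {vertex: [neighbors]}. Returns {vertex: color} or None."""
    vertices = list(adj)
    colors = {}

    def is_valid(v, color):
        # No already-colored neighbor may share this color.
        return all(colors.get(u) != color for u in adj[v])

    def assign(i):
        if i == len(vertices):
            return True                   # every vertex is colored
        v = vertices[i]
        for color in range(1, num_colors + 1):
            if is_valid(v, color):
                colors[v] = color         # choose
                if assign(i + 1):         # explore
                    return True
                del colors[v]             # un-choose
        return False

    return colors if assign(0) else None

path = {'A': ['B'], 'B': ['A', 'C'], 'C': ['B', 'D'], 'D': ['C']}
print(graph_coloring(path, 2))  # {'A': 1, 'B': 2, 'C': 1, 'D': 2}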

Dry Run Example

Graph (a path): A - B - C - D
Available colors: 1, 2, 3
1. Start by coloring vertex A with color 1.
2. Move to vertex B; color 1 conflicts with A, so assign color 2.
3. Move to vertex C; color 1 does not conflict with B, so assign color 1.
4. Continue until all vertices are colored, backtracking if a vertex cannot be colored.

Diagram
Graph coloring for the 4-vertex path:

A --- B --- C --- D

Possible color assignment (one solution):

A → 1
B → 2
C → 1
D → 2

Time & Space Complexity
● Time complexity: O(C^V), where C is the number of colors and V is the number of vertices.
● Space complexity: O(V) for storing the color assignment.

8. Hamiltonian Cycle
Problem Statement
A Hamiltonian cycle is a cycle in an undirected graph that visits each vertex exactly once and returns to the starting
vertex.

Approach
Try to construct the cycle step by step by choosing vertices to add to the cycle, and backtrack when the cycle is not
possible.

Pseudocode
HAM_CYCLE(graph, path, pos):
    if pos == V:
        if graph[path[pos - 1]][path[0]] == 1:
            return True
        return False
    for v in range(1, V):
        if is_safe(graph, path, pos, v):
            path[pos] = v
            if HAM_CYCLE(graph, path, pos + 1):
                return True
            path[pos] = -1
    return False
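A runnable Python sketch of my own for the square graph shown in the diagram below; the graph is an adjacency matrix and vertex 0 is fixed as the starting vertex.

def hamiltonian_cycle(graph):
    """graph: adjacency matrix (list of lists). Returns a cycle as a vertex list, or None."""
    V = len(graph)
    path = [-1] * V
    path[0] = 0                            # fix vertex 0 as the start

    def is_safe(v, pos):
        # v must be adjacent to the previous vertex and not already on the path.
        return graph[path[pos - 1]][v] == 1 and v not in path[:pos]

    def solve(pos):
        if pos == V:
            # All vertices used; check the closing edge back to the start.
            return graph[path[V - 1]][path[0]] == 1
        for v in range(1, V):
            if is_safe(v, pos):
                path[pos] = v              # choose
                if solve(pos + 1):         # explore
                    return True
                path[pos] = -1             # un-choose
        return False

    return path + [0] if solve(1) else None

# Square graph with edges 0-1, 0-2, 1-3, 2-3.
adj = [
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
]
print(hamiltonian_cycle(adj))  # [0, 1, 3, 2, 0]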

Dry Run Example

● Given a graph with vertices [0, 1, 2, 3], check whether a Hamiltonian cycle exists.
● Try adding each vertex to the path and check whether it can complete the cycle.
● Backtrack if the current path does not lead to a valid solution.

Diagram
Graph (edges of the adjacency matrix):

0 --- 1
|     |
2 --- 3

Hamiltonian Cycle: 0 → 1 → 3 → 2 → 0

Time & Space Complexity
● Time complexity: O(V!), since in the worst case every permutation of the vertices is tried.
● Space complexity: O(V) for storing the path.

Conclusion
Backtracking is a powerful algorithmic technique for solving combinatorial problems such as the 8-Queens problem, graph coloring, and Hamiltonian cycles. By systematically trying candidate solutions and backtracking as soon as a partial solution becomes invalid, many complex problems can be solved. However, its time complexity can grow exponentially, so backtracking is best suited to small and medium-sized problem instances.
