task7GradedLab2 Mahul Sethi

This document compares three types of genetic algorithms: the steady-state genetic algorithm (SSGA), the generational genetic algorithm (SGA), and the messy genetic algorithm (MGA). It summarizes the key differences between them in terms of population management, generation replacement, convergence properties, efficiency, and applications. It then provides Python code that performs prediction with a genetic algorithm: the program generates training and test data, defines a fitness function, evolves a population over multiple generations, and outputs the best predictor found along with its predictions on the test data.

Mahul Sethi 1000014223 A(P2)

CSF343 EC - Graded Lab2

7(a). Perform the comparison between various types of Genetic algorithms.

Characteristic           SGA                           SSGA                             MGA
Population Management    Fixed population size         Constant population size         Dynamic population size
Generation Replacement   Full population replacement   Partial population replacement   Dynamic introduction and removal
Convergence              Prone to premature            Less prone to premature          Robust in dynamic and noisy
                         convergence                   convergence                      landscapes
Efficiency               Efficient in resources        Balances efficiency              May require more computational
                                                       and stability                    resources
Applications             Static optimization           Real-time and online             Complex and unpredictable
                         problems                      scenarios                        environments
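The difference between the first two replacement strategies in the table can be sketched in Python. This is an illustrative sketch, not part of the lab submission: the function names (`generational_step`, `steady_state_step`, `make_child`), the toy objective, and the tournament-style selection are all my own assumptions.

```python
import random

def generational_step(population, fitness, make_child):
    # SGA-style: the entire population is replaced by offspring each generation
    return [make_child(population, fitness) for _ in population]

def steady_state_step(population, fitness, make_child):
    # SSGA-style: only the worst individual is replaced; the rest survive
    worst = min(range(len(population)), key=lambda i: fitness(population[i]))
    new_pop = list(population)
    new_pop[worst] = make_child(population, fitness)
    return new_pop

def fitness(x):
    # Toy objective: maximize -(x - 3)^2, optimum at x = 3
    return -(x - 3) ** 2

def make_child(population, fitness):
    # Binary-tournament selection, blend crossover, Gaussian mutation
    p1 = max(random.sample(population, 2), key=fitness)
    p2 = max(random.sample(population, 2), key=fitness)
    return (p1 + p2) / 2 + random.gauss(0, 0.1)

population = [random.uniform(-10, 10) for _ in range(20)]
for _ in range(200):
    population = steady_state_step(population, fitness, make_child)
print(max(population, key=fitness))
```

Note how the steady-state step disturbs the population as little as possible per iteration, which is why SSGA suits the real-time and online scenarios listed above, while the generational step discards all current individuals at once.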

7(b). WAP in Python to perform prediction using a Genetic Algorithm


import numpy as np
import random

def target_function(x):
    # Ground-truth quadratic the GA tries to recover
    return 2 * x**2 + 3 * x + 4

def generate_data(num_points):
    X = np.linspace(0, 10, num_points)
    # Add uniform noise in [-1, 1] to the target values
    y = np.array([target_function(x) + random.uniform(-1, 1) for x in X])
    return X, y

def create_individual():
    # Chromosome = candidate coefficients [a, b, c] of a quadratic
    return [random.uniform(-5, 5), random.uniform(-5, 5), random.uniform(-5, 5)]

def fitness(chromosome, data):
    a, b, c = chromosome
    predictions = [a * x**2 + b * x + c for x in data[0]]
    error = sum((predictions[i] - data[1][i])**2 for i in range(len(data[0])))
    return 1 / (1 + error)  # lower squared error -> higher fitness

population_size = 100
num_generations = 50

mutation_rate = 0.1

data = generate_data(50)

population = [create_individual() for _ in range(population_size)]

for generation in range(num_generations):
    fitness_scores = [fitness(individual, data) for individual in population]

    # Elitism: carry the top 20% forward as parents
    num_parents = int(population_size * 0.2)
    parents = np.array(population)[np.argsort(fitness_scores)[-num_parents:]].tolist()

    children = []
    while len(children) < population_size - num_parents:
        parent1, parent2 = random.sample(parents, 2)  # two distinct parents
        # Single-point crossover on the coefficient list
        crossover_point = random.randint(1, len(parent1) - 1)
        child = parent1[:crossover_point] + parent2[crossover_point:]
        if random.random() < mutation_rate:
            mutation_gene = random.randint(0, len(child) - 1)
            child[mutation_gene] = random.uniform(-5, 5)
        children.append(child)

    population = parents + children

best_predictor = max(population, key=lambda x: fitness(x, data))

a, b, c = best_predictor
print(f"Best Predictor: {a:.2f}x^2 + {b:.2f}x + {c:.2f}")

test_data = generate_data(10)
predictions = [a * x**2 + b * x + c for x in test_data[0]]

print("\nTest Data Predictions:")
for i in range(len(test_data[0])):
    print(f"Input: {test_data[0][i]:.2f}, Prediction: {predictions[i]:.2f}, "
          f"Actual: {test_data[1][i]:.2f}")
