Replace callback_generation by on_generation · evdcush/GeneticAlgorithmPython@18d70b6

Commit 18d70b6

Replace callback_generation by on_generation
The parameter on_generation is used instead of callback_generation.
1 parent fcb886f commit 18d70b6
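For quick reference, a minimal sketch of what the rename means for user code: the per-generation callback is now passed to the pygad.GA constructor through on_generation rather than callback_generation. The fitness function, population size, and gene count below are illustrative placeholders, not taken from this commit, and the two-argument fitness_func signature is the one used by the PyGAD 2.x docs edited here (later PyGAD releases also pass the GA instance as the first argument).

    import pygad

    # Toy fitness function so the sketch is self-contained (placeholder, not from the commit).
    def fitness_func(solution, solution_idx):
        return sum(solution)

    # Called once after every generation; it receives the pygad.GA instance.
    def callback_generation(ga_instance):
        print("Generation =", ga_instance.generations_completed)

    ga_instance = pygad.GA(num_generations=10,
                           num_parents_mating=2,
                           fitness_func=fitness_func,
                           sol_per_pop=8,
                           num_genes=4,
                           on_generation=callback_generation)  # previously: callback_generation=callback_generation
    ga_instance.run()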

2 files changed: +35 -36 lines changed


docs/source/README_pygad_gacnn_ReadTheDocs.rst

Lines changed: 27 additions & 27 deletions
@@ -215,8 +215,8 @@ elements in the output array must range from 0 to 4 inclusive.
 Generally, the class labels start from ``0`` to ``N-1`` where ``N`` is
 the number of classes.
 
-Note that the project only supports classification problems where each
-sample is assigned to only one class.
+Note that the project only supports that each sample is assigned to only
+one class.
 
 .. _header-n89:
 
@@ -396,11 +396,10 @@ predicting the outputs based on the current solution's
 attribute is updated by weights evolved by the genetic algorithm after
 each generation.
 
-PyGAD 2.0.0 and higher has a new parameter accepted by the ``pygad.GA``
-class constructor named ``callback_generation``. It could be assigned to
-a function that is called after each generation. The function must
-accept a single parameter representing the instance of the ``pygad.GA``
-class.
+PyGAD has a parameter accepted by the ``pygad.GA`` class constructor
+named ``on_generation``. It could be assigned to a function that is
+called after each generation. The function must accept a single
+parameter representing the instance of the ``pygad.GA`` class.
 
 This callback function can be used to update the ``trained_weights``
 attribute of layers of each network in the population.
@@ -470,7 +469,7 @@ number of generations is 10.
                        crossover_type=crossover_type,
                        mutation_type=mutation_type,
                        keep_parents=keep_parents,
-                       callback_generation=callback_generation)
+                       on_generation=callback_generation)
 
 The last step for training the neural networks using the genetic
 algorithm is calling the ``run()`` method.
@@ -618,12 +617,13 @@ complete code is listed below.
 def callback_generation(ga_instance):
     global GACNN_instance, last_fitness
 
-    population_matrices = gacnn.population_as_matrices(population_networks=GACNN_instance.population_networks,
+    population_matrices = pygad.gacnn.population_as_matrices(population_networks=GACNN_instance.population_networks,
                                                        population_vectors=ga_instance.population)
 
     GACNN_instance.update_population_trained_weights(population_trained_weights=population_matrices)
 
     print("Generation = {generation}".format(generation=ga_instance.generations_completed))
+    print("Fitness = {fitness}".format(fitness=ga_instance.best_solutions_fitness))
 
 data_inputs = numpy.load("dataset_inputs.npy")
 data_outputs = numpy.load("dataset_outputs.npy")
@@ -634,35 +634,35 @@ complete code is listed below.
 data_inputs = data_inputs
 data_outputs = data_outputs
 
-input_layer = cnn.Input2D(input_shape=sample_shape)
-conv_layer1 = cnn.Conv2D(num_filters=2,
-                         kernel_size=3,
-                         previous_layer=input_layer,
-                         activation_function="relu")
-average_pooling_layer = cnn.AveragePooling2D(pool_size=5,
-                                             previous_layer=conv_layer1,
-                                             stride=3)
+input_layer = pygad.cnn.Input2D(input_shape=sample_shape)
+conv_layer1 = pygad.cnn.Conv2D(num_filters=2,
+                               kernel_size=3,
+                               previous_layer=input_layer,
+                               activation_function="relu")
+average_pooling_layer = pygad.cnn.AveragePooling2D(pool_size=5,
+                                                   previous_layer=conv_layer1,
+                                                   stride=3)
 
-flatten_layer = cnn.Flatten(previous_layer=average_pooling_layer)
-dense_layer2 = cnn.Dense(num_neurons=num_classes,
-                         previous_layer=flatten_layer,
-                         activation_function="softmax")
+flatten_layer = pygad.cnn.Flatten(previous_layer=average_pooling_layer)
+dense_layer2 = pygad.cnn.Dense(num_neurons=num_classes,
+                               previous_layer=flatten_layer,
+                               activation_function="softmax")
 
-model = cnn.Model(last_layer=dense_layer2,
-                  epochs=1,
-                  learning_rate=0.01)
+model = pygad.cnn.Model(last_layer=dense_layer2,
+                        epochs=1,
+                        learning_rate=0.01)
 
 model.summary()
 
 
-GACNN_instance = gacnn.GACNN(model=model,
+GACNN_instance = pygad.gacnn.GACNN(model=model,
                              num_solutions=4)
 
 # GACNN_instance.update_population_trained_weights(population_trained_weights=population_matrices)
 
 # population does not hold the numerical weights of the network instead it holds a list of references to each last layer of each network (i.e. solution) in the population. A solution or a network can be used interchangeably.
 # If there is a population with 3 solutions (i.e. networks), then the population is a list with 3 elements. Each element is a reference to the last layer of each network. Using such a reference, all details of the network can be accessed.
-population_vectors = gacnn.population_as_vectors(population_networks=GACNN_instance.population_networks)
+population_vectors = pygad.gacnn.population_as_vectors(population_networks=GACNN_instance.population_networks)
 
 # To prepare the initial population, there are 2 ways:
 # 1) Prepare it yourself and pass it to the initial_population parameter. This way is useful when the user wants to start the genetic algorithm with a custom initial population.
@@ -692,7 +692,7 @@ complete code is listed below.
                        crossover_type=crossover_type,
                        mutation_type=mutation_type,
                        keep_parents=keep_parents,
-                       callback_generation=callback_generation)
+                       on_generation=callback_generation)
 
 ga_instance.run()
 

docs/source/README_pygad_gann_ReadTheDocs.rst

Lines changed: 8 additions & 9 deletions
@@ -444,10 +444,9 @@ attribute is updated by weights evolved by the genetic algorithm after
 each generation.
 
 PyGAD 2.0.0 and higher has a new parameter accepted by the ``pygad.GA``
-class constructor named ``callback_generation``. It could be assigned to
-a function that is called after each generation. The function must
-accept a single parameter representing the instance of the ``pygad.GA``
-class.
+class constructor named ``on_generation``. It could be assigned to a
+function that is called after each generation. The function must accept
+a single parameter representing the instance of the ``pygad.GA`` class.
 
 This callback function can be used to update the ``trained_weights``
 attribute of layers of each network in the population.
@@ -521,7 +520,7 @@ Here is an example.
                        crossover_type=crossover_type,
                        mutation_type=mutation_type,
                        keep_parents=keep_parents,
-                       callback_generation=callback_generation)
+                       on_generation=callback_generation)
 
 The last step for training the neural networks using the genetic
 algorithm is calling the ``run()`` method.
@@ -763,7 +762,7 @@ its complete code is listed below.
                        crossover_type=crossover_type,
                        mutation_type=mutation_type,
                        keep_parents=keep_parents,
-                       callback_generation=callback_generation)
+                       on_generation=callback_generation)
 
 ga_instance.run()
 
@@ -915,7 +914,7 @@ according to the ``num_neurons_output`` parameter of the
                        crossover_type=crossover_type,
                        mutation_type=mutation_type,
                        keep_parents=keep_parents,
-                       callback_generation=callback_generation)
+                       on_generation=callback_generation)
 
 ga_instance.run()
 
@@ -1094,7 +1093,7 @@ for regression.
                        crossover_type=crossover_type,
                        mutation_type=mutation_type,
                        keep_parents=keep_parents,
-                       callback_generation=callback_generation)
+                       on_generation=callback_generation)
 
 ga_instance.run()
 
@@ -1262,7 +1261,7 @@ Here is the complete code.
                        crossover_type=crossover_type,
                        mutation_type=mutation_type,
                        keep_parents=keep_parents,
-                       callback_generation=callback_generation)
+                       on_generation=callback_generation)
 
 ga_instance.run()
 
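The gann hunks above only touch the pygad.GA(...) constructor lines; the callback they pass is defined elsewhere in that README. As a hedged sketch of what such a generation callback looks like, assuming a GANN_instance created earlier via pygad.gann.GANN(...) and the pygad.gann helpers that mirror the pygad.gacnn ones shown in the first file:

    import pygad
    import pygad.gann

    # Sketch only: GANN_instance is assumed to be the pygad.gann.GANN object
    # built earlier in the gann README example.
    def callback_generation(ga_instance):
        global GANN_instance

        # Reshape the evolved 1-D chromosomes back into per-layer weight matrices.
        population_matrices = pygad.gann.population_as_matrices(population_networks=GANN_instance.population_networks,
                                                                population_vectors=ga_instance.population)

        # Load the evolved weights into the networks before the next generation runs.
        GANN_instance.update_population_trained_weights(population_trained_weights=population_matrices)

        print("Generation = {generation}".format(generation=ga_instance.generations_completed))

It is wired in exactly as the hunks above show: on_generation=callback_generation instead of callback_generation=callback_generation.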

0 commit comments