Research Article

Performance Analysis of Simulated Annealing and Genetic Algorithm on systems of linear equations

[version 1; peer review: 1 approved, 1 not approved]
PUBLISHED 20 Dec 2021

This article is included in the Research Synergy Foundation gateway.

Abstract

Solving large systems of simultaneous equations with traditional substitution methods is time consuming. For such large-scale global-optimization problems, the Simulated Annealing (SA) algorithm and the Genetic Algorithm (GA), meta-heuristic random-search techniques, perform faster.
Therefore, this study applies SA to solve systems of linear equations and evaluates its performance against Genetic Algorithms (GAs), a population-based search meta-heuristic widely used in Travelling Salesman Problems (TSP), noise reduction and many other applications. This paper compares the performance of SA and GA in solving real-time scientific problems. Systems of simultaneous linear equations containing different numbers of unknown variables were simulated in MATLAB using the two algorithms. In all of the experiments, SA started from a randomly generated initial solution set, while GA started from a randomly generated population of solution sets. The two algorithms were then compared on how well the optimization produced solution sets for the selected systems.
Based on the experiments on these sets of simultaneous equations, the SA algorithm is superior to GA, with a lower fitness function evaluation count in the MATLAB simulations. Since complex non-linear systems of equations were not the primary focus of this research, the performance of SA and GA on such equations will be addressed in future work. Even though GA maintained a relatively lower number of average generations than SA, SA still outperformed GA with a considerably lower fitness function evaluation count. Although SA sometimes converges slowly, it is efficient for solving the systems of simultaneous equations considered here. In terms of computational complexity, SA was far superior to GA.

Keywords

Optimization, Simulated Annealing, Simultaneous Equations, Genetic Algorithms

Introduction

The Simulated Annealing (SA) algorithm is a well-known meta-heuristic for solving problems that require best or optimal results. It can easily be implemented in most programming languages, and as such it has gained wide acceptance in computational and scientific research over the past few decades.

On the other hand, in science and engineering, systems of equations (simultaneous equations) involving unknown variables are used to model and solve real-life problems. The two most common methods for solving simultaneous equations are Substitution and Elimination.1 Substitution works well for a system of two equations with two variables. However, for larger cases, such as a system of three equations with three variables, Substitution can become time consuming. The Elimination method can be an easier alternative for three equations with three variables, but as the number of equations and variables grows, it too becomes difficult and lengthy. Besides, the Gaussian Elimination method, a numerical technique, is not always capable of finding good-quality solutions for complex systems of equations.2 Hence, solving simultaneous equations with the SA approach could make finding possible solution sets easier, faster and more convenient.

The purpose of this research is to study the performance of the SA algorithm in solving simultaneous linear equations, comparing it with Genetic Algorithms (GAs).

Literature review

Genetic Algorithms (GAs) have been used to solve simultaneous equations in the past. Abiodun et al.3 implemented a GA to solve simultaneous linear equations. They experimented with seven different systems of equations and performed a comparative analysis against the Gaussian Elimination method. They were successful in solving the equations using GA. In some cases, GA gave the best or exact solutions, which was not possible with the Gaussian Elimination method. Their GA was also able to find other combinations of solution sets, which again was not possible with conventional methods such as Substitution and Elimination. Their work indicated that GA outperformed the existing conventional methods of solving simultaneous equations.

A review on Simulated Annealing and its applications

Simulated Annealing (SA) is an approach to solving optimization problems introduced by Kirkpatrick et al.,4 which uses randomization in its search operations. The idea of the SA algorithm comes from metallurgy, where it mimics the re-crystallization of a metal during slow cooling, a process termed annealing.4

In the SA simulation,4 initially, the system undergoing optimization is melted at a high temperature. Next, the temperature is decreased slowly to freeze the system. This continues until no further change is possible. The SA algorithm has the capability to escape from local minima by accepting or rejecting new solution candidates based on a probability function.

The basic SA algorithm4 has the following procedure:

  • 1. Specify the initial temperature T0 and initialize the solution candidate

  • 2. Evaluate the fitness E of the candidate

  • 3. Move the candidate randomly to a neighboring solution

  • 4. Evaluate the fitness E′ of the new solution

  • 5. If E′ ≤ E, accept the new solution

  • 6. If E′ > E, accept the new solution with acceptance probability P, as shown in (1) below

  • 7. Decrease the temperature T

  • 8. Stop the search if T is close to zero; otherwise, go back to step 2

(1)
P = e^(−(E′ − E)/T)

The temperature is regulated by the following:

(2)
T(k+1) = λ·T(k),  0 < λ < 1

In equation (2), k is the iteration number and λ is a coefficient used to adjust the cooling schedule. The advantage of the SA approach is that it can reach the global optimum with high probability; however, it requires many iterations to converge to a solution.
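The eight steps above can be sketched as a short program. The following Python version is an illustrative sketch only (the authors implemented their algorithms in MATLAB); the toy objective, the neighbor step size and the geometric cooling constant are assumptions for demonstration:

```python
import math
import random

def simulated_annealing(energy, neighbor, initial, t0=1.0, cooling=0.95, iterations=1000):
    """Basic SA loop following steps 1-8; lower energy E is better."""
    current, e_curr = initial, energy(initial)            # steps 1-2
    best, e_best = current, e_curr
    t = t0
    for _ in range(iterations):
        candidate = neighbor(current)                     # step 3: random neighboring move
        e_new = energy(candidate)                         # step 4: evaluate E'
        # steps 5-6: always accept improvements; accept worse moves with probability (1)
        if e_new <= e_curr or random.random() < math.exp(-(e_new - e_curr) / t):
            current, e_curr = candidate, e_new
        if e_curr < e_best:
            best, e_best = current, e_curr
        t *= cooling                                      # step 7: geometric cooling as in (2)
        if t < 1e-9:                                      # step 8: stop when T is near zero
            break
    return best, e_best

# Toy usage: minimize E(x) = (x - 3)^2 starting from x = 0
random.seed(42)
best, e_best = simulated_annealing(
    energy=lambda x: (x - 3) ** 2,
    neighbor=lambda x: x + random.uniform(-0.5, 0.5),
    initial=0.0,
)
```

The geometric schedule here matches equation (2); the linear schedule the paper actually used appears later in Table 1.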

Some recent applications of SA are discussed below.

Jayabalan et al.5 have proposed two modified versions of SAs to solve a scheduling problem known as permutation flowshop.

In another work, by Cunha and Marques,6 they have developed a trajectory-based SA with newer versions of reannealing and generation techniques that enable better convergence for producing solutions involving diversity and uniformity.

In a recent study,7 an approach called mixed-integer programming is discussed with SA proposed as a solution.

Odziemczyk8 has introduced a procedure to solve a three-dimensional coordinate transformation problem using a variant of the SA technique.

Grabusts et al.9 have experimented with the SA method and Travelling Salesman Problems (TSP) to find the shortest routes between destinations. Their study discussed applications of mathematical models that could be utilized to solve real-life problems.

Gallo and Capozzi10 have used SA to solve a scheduling problem that involves several key parameters like tempering, freezing etc. In one study,11 the authors have used a modified SA procedure to solve a problem related to sample allocation.

In both studies by A. A. N. P. Redi,12,13 SA algorithms have been applied in routing problems. In the first study,12 the objective was to find minimum routes that require less transportation time whereas, in the second study,13 the research aimed to reduce the transportation expenses.

In a most recent study in Engineering optimization, Meng14 tried to discover new parameters pertaining to the creative and cultural domain of product design to improve the performance of the SA algorithm.

Managing tasks in a cloud computing environment is a huge and daunting job that requires smart handling. The study by Fanian et al.15 has therefore contributed a better way of scheduling tasks by developing a new algorithmic approach inspired by the Firefly and SA algorithms.

Gripon et al.16 conducted experiments on training discrete weights using a combination of traditional SA and ideas taken from statistical physics.

P. Aravind et al.17 have used a heuristic method based on SA to overcome the problem of allocating switches within a short time for various networks. Their model was able to solve the problem within a reasonable amount of time for varied network sizes.

Avinash C. Pandey and Dharmveer S. Rajpoot18 have focused on getting accurate results in the classification of features by combining the techniques of SA and grey wolf optimization.

Jie Zhou et al.19 have introduced an efficient way to increase the life span of large-scale wireless sensor networks by using a special version of SA called the “Elite Adaptive Simulated Annealing Algorithm (EASA)”.

Czerniachowska, K et al.20 have addressed an interesting problem of arranging products on shelves which can be a strenuous task involving retailing decisions. Therefore, they have applied the SA procedure to handle the effective decision making of shelf allocation of products on symmetrical planograms.

In the area of industrial cutting, one study21 has investigated a multi-objective optimization problem for irregular objects in the processing of aquatic products and developed an SA technique for the problem of squid cutting.

Methods

In this study, both SA and GA have been implemented in MATLAB (Version: R2017a, RRID: SCR_001622) based on the setup of two algorithms stated in Table 1 and Table 2. MATLAB is software that provides functionality to implement algorithms, analyze data, or create models.

Table 1. Simulated Annealing parameter setup.

SA parameters
Cooling schedule (linear): T = 1 − iteration / (C + total no. of iterations)
Probability calculation: P = 1 if E_new < E_curr; P = e^(−(E_new − E_curr)/T) otherwise
Total number of iterations: 1000
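Read as ordinary formulas, the two Table 1 entries are a linear cooling schedule and a Metropolis-style acceptance rule. The Python sketch below is one interpretation of those entries, not the authors' exact MATLAB code:

```python
import math

TOTAL_ITERATIONS = 1000

def temperature(iteration, c=0):
    """Linear cooling: T = 1 - iteration / (C + total no. of iterations).
    C = 0 gives the plain schedule; C = 50 or C = 100 slows the cooling
    (values tried in the Discussion)."""
    return 1.0 - iteration / (c + TOTAL_ITERATIONS)

def acceptance_probability(e_new, e_curr, t):
    """P = 1 if E_new < E_curr; otherwise P = e^(-(E_new - E_curr) / T).
    Requires t > 0."""
    if e_new < e_curr:
        return 1.0
    return math.exp(-(e_new - e_curr) / t)
```

With C = 0 the temperature falls from 1 at the start of the run to 0 at iteration 1000, so late in the run almost no worsening move is accepted.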

Table 2. Genetic algorithm parameter setup.

GA parameters
Population size: 100
Reproduction: 60%
Crossover: 30%
Elitism: 10%
Mutation: 0.1
Total number of generations: 1000
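One way to realize the Table 2 split is to build each new 100-chromosome generation from 10 elites, 60 selected copies and 30 crossover offspring, mutating genes at rate 0.1. The concrete operators below (tournament selection, single-point crossover, uniform resampling mutation) are assumptions; the paper does not specify which variants its MATLAB GA used:

```python
import random

POP_SIZE = 100
N_REPRODUCTION = 60   # 60% copied via fitness-based selection
N_CROSSOVER = 30      # 30% produced by crossover
N_ELITE = 10          # 10% best carried over unchanged (elitism)
MUTATION_RATE = 0.1

def tournament(population, fitness, k=3):
    """Return the fittest of k randomly chosen individuals (fitness is maximized)."""
    return max(random.sample(population, k), key=fitness)

def crossover(a, b):
    """Single-point crossover of two parent chromosomes (lists of gene values)."""
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:]

def mutate(chromosome, low=-10.0, high=10.0):
    """Resample each gene with probability MUTATION_RATE."""
    return [random.uniform(low, high) if random.random() < MUTATION_RATE else g
            for g in chromosome]

def next_generation(population, fitness):
    """Produce the next 100-chromosome population under the Table 2 split."""
    elite = sorted(population, key=fitness, reverse=True)[:N_ELITE]
    reproduced = [tournament(population, fitness) for _ in range(N_REPRODUCTION)]
    crossed = [crossover(tournament(population, fitness),
                         tournament(population, fitness))
               for _ in range(N_CROSSOVER)]
    return elite + [mutate(c) for c in reproduced + crossed]

# Demo on experiment 1 of this study (x1 + 4x2 = 6, x1 + 2x2 = 4):
# fitness is the negated sum of absolute equation residuals.
random.seed(0)
population = [[random.uniform(-10, 10), random.uniform(-10, 10)] for _ in range(POP_SIZE)]
fit = lambda c: -(abs(c[0] + 4 * c[1] - 6) + abs(c[0] + 2 * c[1] - 4))
population = next_generation(population, fit)
```

Because the elites are carried over untouched, the best fitness in the population never decreases from one generation to the next.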

Simultaneous equations, in mathematics, play a very important role in modeling real-life problems. They comprise a finite set of equations that should meet at a common point in space.22 The equations involve at least two variables and can be either linear or non-linear.

According to Gilbert,23 a system of simultaneous equations, linear in nature, can be written in the following manner:

a11·x1 + a12·x2 + ⋯ + a1N·xN = y1
a21·x1 + a22·x2 + ⋯ + a2N·xN = y2
⋮
aN1·x1 + aN2·x2 + ⋯ + aNN·xN = yN

This system can be represented in matrix-vector form as follows:

(3)
Ax = y

In this expression, the unknown variables are contained in the vector x, N is the number of unknowns, and the constants a form the coefficient matrix A.
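A candidate solution x can be validated directly in the matrix-vector form by computing Ax and comparing it with y. A minimal pure-Python sketch, using the first system from this study's experiments as the example:

```python
def matvec(A, x):
    """Compute A·x for a dense coefficient matrix A and solution vector x."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

def solves(A, x, y, tol=1e-9):
    """True if x satisfies Ax = y within a tolerance."""
    return all(abs(lhs - rhs) <= tol for lhs, rhs in zip(matvec(A, x), y))

# Experiment 1: x1 + 4x2 = 6, x1 + 2x2 = 4; solution x1 = 2, x2 = 1
A = [[1, 4], [1, 2]]
y = [6, 4]
assert solves(A, [2, 1], y)
```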

In this study, both SA and GA have been implemented to find the values of unknowns in the equations.

Data is randomly generated for each unknown variable of a system of simultaneous equations. The generated data forms each candidate solution set in SA and each chromosome in GA. In SA, a single vector of values (one solution set) is generated initially, while in GA, a population of 100 chromosomes is generated initially. To evaluate the fitness of a solution set in SA or a chromosome in GA, the randomly generated vector of values is substituted into the left-hand side (LHS) of each equation. The LHS values are then compared with the values on the right-hand side (RHS). If the evaluated LHS of every equation equals the given RHS, the solution set or chromosome of the unknowns is considered valid, i.e., it fits the system of equations.

To score the fitness of each solution set in SA or chromosome in GA, an arbitrary value of 3000 is taken as the starting fitness. For each equation in the system, the absolute value of the difference between the LHS and the RHS is computed and subtracted from the fitness, i.e.,

(4)
DIFFERENCE = |LHS − RHS|

(5)
FITNESS = FITNESS − DIFFERENCE

If every difference is 0, the fitness remains unchanged (i.e., 3000); otherwise, the fitness decreases. To calculate the energy of a solution set in SA, its fitness is subtracted from the same arbitrary value (3000), since fitness and energy are opposites in the case of SA. This subtraction simply flips fitness into energy.
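Equations (4) and (5), together with the fitness-to-energy flip described above, can be sketched as follows (pure Python; the authors' implementation was in MATLAB). The start fitness of 3000 is taken from the text:

```python
START_FITNESS = 3000

def fitness(A, x, y):
    """Apply eqs. (4)-(5) per equation: FITNESS = 3000 - sum of |LHS - RHS|.
    A perfect solution keeps the full 3000."""
    f = START_FITNESS
    for row, rhs in zip(A, y):
        lhs = sum(a_ij * x_j for a_ij, x_j in zip(row, x))
        f -= abs(lhs - rhs)          # eq. (4) DIFFERENCE, applied via eq. (5)
    return f

def energy(A, x, y):
    """SA minimizes energy: the flip of fitness about the start value."""
    return START_FITNESS - fitness(A, x, y)

# Experiment 1 (Table 3): x1 + 4x2 = 6, x1 + 2x2 = 4; solution x1 = 2, x2 = 1
A, y = [[1, 4], [1, 2]], [6, 4]
assert fitness(A, [2, 1], y) == 3000 and energy(A, [2, 1], y) == 0
```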

Results

Table 3 below presents the results of SA solving simultaneous equations. For this study, 10 different systems of equations, involving two, three and four unknown variables, were selected. Each experiment was performed 15 times to record the average performance of SA. As can be seen in Table 3, for the first three systems of equations, involving two variables (Experiment No. 1, 2 and 3), SA gave its best performance, solving them within a minimum of 3, 2 and 3 iterations respectively, while keeping the maximum average at 73 iterations (Experiment No. 3). The parameter setups for SA and GA are shown in Table 1 and Table 2 in the Methods section.

Table 3. Results of SA solving simultaneous equations.

No. | Equations | SA results: solution of unknowns | Test runs | Min. iterations (best result) | Avg. iterations | Avg. fitness function evaluations
1 | x1 + 4x2 = 6; x1 + 2x2 = 4 | x1 = 2, x2 = 1 | 15 | 3 | 41 | 41
2 | 2x1 + 5x2 = 33; x1 + 3x2 = 19 | x1 = 4, x2 = 5 | 15 | 2 | 60 | 60
3 | 3x1 + 2x2 = 36; 5x1 + 4x2 = 64 | x1 = 8, x2 = 6 | 15 | 3 | 73 | 73
4 | 10x1 + x2 + x3 = 12; 2x1 + 10x2 + x3 = 13; 2x1 + 2x2 + 10x3 = 14 | x1 = 1, x2 = 1, x3 = 1 | 15 | 5 | 57 | 57
5 | 2x1 + x2 + 3x3 = 13; x1 + 5x2 + x3 = 14; 3x1 + x2 + 4x3 = 17 | x1 = 1, x2 = 2, x3 = 3 | 15 | 5 | 80 | 80
6 | x1 + 2x2 + 3x3 = 6; 2x1 + 4x2 + x3 = 7; 3x1 + 2x2 + 9x3 = 14 | x1 = 1, x2 = 1, x3 = 1 | 15 | 28 | 83 | 83
7 | x1 + x2 + x3 = 6; x1 − x2 + x3 = 2; x1 + 2x2 − x3 = 2 | x1 = 1, x2 = 2, x3 = 3 | 15 | 15 | 101 | 101
8 | x1 + 2x2 + 3x3 = 14; x1 + x2 + x3 = 6; 3x1 + 2x2 + x3 = 10 | x1 = 1, x2 = 2, x3 = 3; x1 = 2, x2 = 0, x3 = 4; x1 = 0, x2 = 4, x3 = 2 | 15 | 12 | 120 | 120
9 | 3x1 + 2x2 − x3 = 1; 2x1 − 2x2 + 4x3 = −2; −x1 + 0.5x2 − x3 = 0 | x1 = 1, x2 = −2, x3 = −2 | 15 | 14 | 176 | 176
10 | 4x1 + 3x2 + 2x3 + x4 = 10; 3x1 + 2x2 + x3 + 4x4 = 6; 2x1 + x2 + 4x3 + 3x4 = 14; x1 + 4x2 + 3x3 + 2x4 = 10 | x1 = 1, x2 = 0, x3 = 3, x4 = 0 | 15 | 61 | 150 | 150

Again, with equations involving three variables, the algorithm performed relatively well, solving them within as few as 5 iterations at minimum and 57, 80 and 83 iterations on average (Experiment No. 4, 5 and 6). However, soon after that, the average number of iterations kept increasing, reaching a maximum of 176 iterations (Experiment No. 9). This increase is perhaps an indication of the pattern complexity of the solution sets (usually a combination of both positive and negative numbers).

Also, it is interesting to note that in Experiment No. 8, SA found three different sets of solutions, making it more capable than the traditional techniques like Substitution and Elimination, as discussed earlier. In Experiment No. 10, involving four variables, there was a sudden but expected increase in the minimum number of iterations, which reached 61. In the long run, however, SA kept its pace, maintaining an average of 150 iterations, lower than in Experiment No. 9.

Overall, the SA algorithm successfully solved the selected systems of equations within a reasonable number of iterations, with a worst-case average of 176 iterations and a fitness evaluation count of 176. Figures 1-10 give a glimpse of the performance of the SA algorithm implemented in MATLAB, showing the energy level at each iteration count until it reaches 0 at the target iteration, where the expected solutions are found.


Figure 1. Experiment No. 1, minimum no. of iterations = 3.


Figure 2. Experiment No. 2, minimum no. of iterations = 2.


Figure 3. Experiment No. 3, minimum no. of iterations = 3.


Figure 4. Experiment No. 4, minimum no. of iterations = 5.


Figure 5. Experiment No. 5, minimum no. of iterations = 5.


Figure 6. Experiment No. 6, minimum no. of iterations = 28.


Figure 7. Experiment No. 7, minimum no. of iterations = 15.


Figure 8. Experiment No. 8, minimum no. of iterations = 12.


Figure 9. Experiment No. 9, minimum no. of iterations = 14.


Figure 10. Experiment No. 10, minimum no. of iterations = 61.

Table 4 presents the results of GA solving simultaneous equations. To evaluate the performance of SA against GA, the same systems of equations were tested with GA, and each experiment was again performed 15 times, just like in SA.

Table 4. Results of GA solving simultaneous equations.

No. | Equations | GA results: solution of unknowns | Test runs | Min. generations (best result) | Avg. generations | Avg. fitness function evaluations
1 | x1 + 4x2 = 6; x1 + 2x2 = 4 | x1 = 2, x2 = 1 | 15 | 1 | 2 | 200
2 | 2x1 + 5x2 = 33; x1 + 3x2 = 19 | x1 = 4, x2 = 5 | 15 | 1 | 2 | 200
3 | 3x1 + 2x2 = 36; 5x1 + 4x2 = 64 | x1 = 8, x2 = 6 | 15 | 1 | 2 | 200
4 | 10x1 + x2 + x3 = 12; 2x1 + 10x2 + x3 = 13; 2x1 + 2x2 + 10x3 = 14 | x1 = 1, x2 = 1, x3 = 1 | 15 | 2 | 35 | 3500
5 | 2x1 + x2 + 3x3 = 13; x1 + 5x2 + x3 = 14; 3x1 + x2 + 4x3 = 17 | x1 = 1, x2 = 2, x3 = 3 | 15 | 2 | 17 | 1700
6 | x1 + 2x2 + 3x3 = 6; 2x1 + 4x2 + x3 = 7; 3x1 + 2x2 + 9x3 = 14 | x1 = 1, x2 = 1, x3 = 1 | 15 | 1 | 12 | 1200
7 | x1 + x2 + x3 = 6; x1 − x2 + x3 = 2; x1 + 2x2 − x3 = 2 | x1 = 1, x2 = 2, x3 = 3 | 15 | 1 | 9 | 900
8 | x1 + 2x2 + 3x3 = 14; x1 + x2 + x3 = 6; 3x1 + 2x2 + x3 = 10 | x1 = 1, x2 = 2, x3 = 3; x1 = 2, x2 = 0, x3 = 4; x1 = 0, x2 = 4, x3 = 2 | 15 | 1 | 3 | 300
9 | 3x1 + 2x2 − x3 = 1; 2x1 − 2x2 + 4x3 = −2; −x1 + 0.5x2 − x3 = 0 | x1 = 1, x2 = −2, x3 = −2 | 15 | 2 | 196 | 19600
10 | 4x1 + 3x2 + 2x3 + x4 = 10; 3x1 + 2x2 + x3 + 4x4 = 6; 2x1 + x2 + 4x3 + 3x4 = 14; x1 + 4x2 + 3x3 + 2x4 = 10 | x1 = 1, x2 = 0, x3 = 3, x4 = 0 | 15 | 8 | 50 | 5000

In the first three experiments (Experiment No. 1, 2 and 3), GA solved the equations within an average of 2 generations. However, there was a sudden increase in the average number of generations to 35 in Experiment 4, with equations involving 3 variables. In the following experiments, 5 through 8, there was a gradual decline in the average number of generations (17, 12, 9 and 3, respectively), indicating the improved performance of the algorithm.

In Experiment 9, there was a dramatic increase in the average number of generations, reaching a maximum of 196. This sudden increase is consistent with the same experiment performed with SA earlier, where the average number of iterations rose to a maximum of 176 from 120. As with SA, the increase is perhaps an indication of the pattern complexity of the solution sets (usually a combination of both positive and negative numbers).

In experiments 1 to 9, the algorithm found solutions within a minimum of 1 or 2 generations, until in Experiment 10 (equations with 4 unknowns) there was a significant rise in the minimum number of generations to 8 and in the average number of generations to 50, due to the increase in problem size. However, the algorithm still maintained a much lower average number of generations than the 196 of Experiment 9, which is again consistent with the SA results shown earlier.

GA was also able to find three different sets of solutions in Experiment 8, just like SA, making both optimization techniques, in general, more capable than traditional methods like Substitution and Elimination, as mentioned earlier.

Even though GA solved all the equations just as SA did, within a maximum average of 196 generations, the comparison with SA reveals a large difference: the maximum average number of iterations for SA stands at 176, lower than for GA. Moreover, in SA a single iteration involves just one solution set, whereas in GA a single generation involves a whole population of 100 chromosomes, or solution sets (as taken for this study). Therefore, the average number of fitness function evaluations in SA stands at 176, while in GA it stands at 196 × 100 = 19,600, since the fitness of the entire population is evaluated in each generation. This makes GA computationally expensive, unlike SA, where only one solution set is evaluated per iteration.
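The evaluation-count arithmetic in the paragraph above works out as follows (figures taken from Tables 3 and 4):

```python
avg_iterations_sa = 176      # worst-case average iterations for SA (Table 3, experiment 9)
avg_generations_ga = 196     # worst-case average generations for GA (Table 4, experiment 9)
population_size = 100        # chromosomes evaluated per GA generation

evals_sa = avg_iterations_sa * 1                 # SA: one solution set per iteration
evals_ga = avg_generations_ga * population_size  # GA: whole population per generation
```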

Figures 11-14 give a glimpse of the performance of the GA, implemented in MATLAB, showing the maximum and the average population fitness for each generation until the maximum fitness of 3000 (the defined arbitrary fitness value) is reached at the target generation and the expected solution sets are found.


Figure 11. Experiment No. 4, minimum no. of generations = 2.


Figure 12. Experiment No. 5, minimum no. of generations = 2.


Figure 13. Experiment No. 9, minimum no. of generations = 2.


Figure 14. Experiment No. 10, minimum no. of generations = 8.

Discussion

This research applied the Simulated Annealing (SA) algorithm and evaluated its performance against Genetic Algorithms (GAs). During this experimental study, 10 different systems of simultaneous linear equations involving 2, 3 and 4 variables were solved.

In all the experiments, an initial solution set (in SA) or a population of solution sets (in GA) was randomly generated for the optimization to take place.

In the case of SA, a linear cooling schedule was used to control the temperature, and it was effective in finding all the solutions. Further, slower cooling obtained by adding a constant (C = 50 and C = 100) was also tried in some cases, but it did not make a big difference in the number of iterations needed to find solutions.

In the case of GA, a population size of 100, a reproduction rate of 60%, a crossover of 30% and elitism of 10% were used. The mutation rate was kept at 0.1.

During the experimentation phase, the following observations were made:

  • While solving simultaneous equations, both SA and GA were able to solve them and, for some systems, produced more than one set of solutions. For instance, in Experiment 8 (Table 3 and Table 4), both SA and GA found three different solution sets that fit the equations perfectly. This indicates that these optimization techniques can be more capable than conventional methods like the Gaussian Elimination method.

  • Even though GA maintained a relatively lower number of average generations than SA, SA still managed to outperform GA with a reasonably lower fitness function evaluation count and thereby proved to be far more computationally efficient than GA.

  • Even though SA converges slowly sometimes, it can still be regarded as an efficient technique for solving problems that require optimization.

  • In terms of computational complexity, the SA algorithm proved to be far superior to GAs.

Conclusions

This study applied the Simulated Annealing (SA) algorithm to solve simultaneous linear equations and evaluated its performance against Genetic Algorithms (GAs).

The limitation of this paper is that the real-time systems considered were restricted to sets of linear equations; non-linear equations were not included.

However, in future, the focus of this research with SA will be on providing solutions for non-linear systems of equations. Furthermore, as part of future work, Harmony Search,24 a recent optimization technique based on music improvisation, can be used to solve systems of equations.

Data availability

No data are associated with this article

Ethical approval and consent

No humans or animals were included in this study.

Author contribution

Md. Shabiul Islam: Conceptualization, Supervision, Writing-Review & Editing

Most Tahamina Khatoon: Conceptualization, Data Curation, Formal Analysis, Investigation, Methodology, Resources, Software, Validation, Visualization, Writing-Original Draft

Kazy Noor-e-Alam Siddiquee: Conceptualization, Formal Analysis, Methodology, Project Administration, Writing-Original Draft Preparation, Writing-Review & Editing

Wong Hin Yong: Writing-Review & Editing

Mohammad Nurul Huda: Supervision, Writing-Review & Editing

How to cite this article
Islam MS, Khatoon MT, Siddiquee KNeA et al. Performance Analysis of Simulated Annealing and Genetic Algorithm on systems of linear equations [version 1; peer review: 1 approved, 1 not approved]. F1000Research 2021, 10:1297 (https://doi.org/10.12688/f1000research.73581.1)

Open Peer Review

Reviewer Report 14 Jan 2022
Andranik S. Akopov, Department of Business Informatics, HSE University, Moscow, Russian Federation
Status: Not Approved
"The problem statement described in the submitted paper does not make sense. Genetic Algorithms are applied to solve more complex optimization problems (e.g., non-convex, multiobjective optimization, etc.). Thus, the proposed method looks like an application of complex heuristic algorithms to ..."
How to cite this report
Akopov AS. Reviewer Report For: Performance Analysis of Simulated Annealing and Genetic Algorithm on systems of linear equations [version 1; peer review: 1 approved, 1 not approved]. F1000Research 2021, 10:1297 (https://doi.org/10.5256/f1000research.77242.r116261)
  • Author Response 17 Jan 2022
    Kazy Noor e Alam Siddiquee, Department of Computer Science and Engineering, University of Science and Technology Chittagong, Foy’s Lake, Post Box-1079, Chattogram-4202, Bangladesh
    Dear Reviewer,
    Thank you very much for your formatted review. The work is primarily focusing on linear systems equation prioritizing 10 equations as a sample. Other variants of SA are ...
Reviewer Report 14 Jan 2022
Aloke Kumar Saha, University of Asia Pacific, Dhaka, Bangladesh
Status: Approved
"This is a good work. However there are some observations from my end:
1. Authors mentioned that no data are associated with this article, but they provided some examples. I think the number of examples should be ..."
How to cite this report
Saha AK. Reviewer Report For: Performance Analysis of Simulated Annealing and Genetic Algorithm on systems of linear equations [version 1; peer review: 1 approved, 1 not approved]. F1000Research 2021, 10:1297 (https://doi.org/10.5256/f1000research.77242.r116271)