Keywords
Optimization, Simulated Annealing, Simultaneous Equations, Genetic Algorithms
The Simulated Annealing (SA) algorithm is a well-known meta-heuristic for problems that require finding optimal or near-optimal solutions. The algorithm can easily be implemented in any programming language, and as such it has gained wide acceptance in computational and scientific research over the past few decades.
On the other hand, in science and engineering, systems of equations, or simultaneous equations, involving unknown variables are applied to solve real-life problems through mathematical modelling. In solving simultaneous equations, the two most common methods are the Substitution method and the Elimination method.1 Substitution works nicely for a system of two equations with two variables. However, for larger cases, such as a system of three equations with three variables, the Substitution method can become time-consuming. In contrast, the Elimination method can be an easier alternative for three equations with three variables; still, as the number of equations and variables increases, it too becomes a difficult and lengthy process. Besides, the Gaussian Elimination method, a numerical technique, is not always capable of finding good-quality solutions for complex systems of equations.2 Hence, being able to solve simultaneous equations with the SA approach could make finding possible solution sets much easier, faster, and more convenient.
The purpose of this research is to study the performances of the SA algorithm while comparing it with Genetic Algorithms (GAs), in solving simultaneous linear equations.
Genetic Algorithms (GAs) have been used to solve simultaneous equations in the past. Abiodun et al.3 implemented a GA to solve simultaneous linear equations. They experimented with seven different systems of equations and performed a comparative analysis against the Gaussian Elimination method. They were successful in solving the equations using the GA. In some cases, the GA was able to give the best or exact solutions, which was not possible with the Gaussian Elimination method. Moreover, their GA was able to find other combinations of solution sets, which again was not possible with conventional methods such as Substitution and Elimination. Their work indicated that the GA was superior to the existing conventional methods of solving simultaneous equations.
Simulated Annealing (SA) is an approach to solving optimization problems, introduced by Kirkpatrick et al.,4 that uses randomization in its search operations. The idea of the SA algorithm comes from metallurgy: it mimics the re-crystallization of metal during slow cooling, a process termed annealing.4
In the SA simulation,4 initially, the system undergoing optimization is melted at a high temperature. Next, the temperature is decreased slowly to freeze the system. This continues until no further change is possible. The SA algorithm has the capability to escape from local minima by accepting or rejecting new solution candidates based on a probability function.
The basic SA algorithm4 has the following procedure:
1. T0, the initial temperature, is specified, and an initial solution candidate is generated
2. The fitness E of the candidate is evaluated
3. The candidate is moved to a random neighboring solution
4. The fitness E′ of the new solution is evaluated
5. If E′ ≤ E, the new solution is accepted
6. If E′ > E, the new solution is accepted with acceptance probability P, as shown in (1) below
7. The temperature T is decreased
8. If T is close to zero, the SA search stops; otherwise, it returns to step 2

P = exp(−(E′ − E)/T) (1)
The temperature is regulated by the following linear cooling schedule:

Tk = T0 − λk (2)

In equation (2), k is the iteration number and λ is a coefficient used to adjust the cooling schedule. The advantage of the SA approach is that it can reach the global optimum with greater probability; however, it requires many iterations to converge to a solution.
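The acceptance rule and cooling schedule described above can be sketched as a minimal loop in Python (the study's implementation was in MATLAB). The one-variable toy problem, the parameter values, and the ±1 neighbor move here are illustrative assumptions, not the authors' exact setup.

```python
import math
import random

def simulated_annealing(energy, neighbor, x0, t0=10.0, lam=0.01, max_iter=5000):
    """Basic SA loop: Metropolis acceptance with the linear schedule T_k = T0 - lam*k."""
    x, e = x0, energy(x0)
    for k in range(max_iter):
        if e == 0:                        # exact solution found, stop early
            return x, e
        t = max(t0 - lam * k, 1e-6)       # linear cooling, floored just above zero
        x_new = neighbor(x)
        e_new = energy(x_new)
        # Accept improvements outright; accept worse moves with prob exp(-(E'-E)/T)
        if e_new <= e or random.random() < math.exp(-(e_new - e) / t):
            x, e = x_new, e_new
    return x, e

# Toy run: find the integer minimising |x - 7|, starting from 0
random.seed(42)
sol, e = simulated_annealing(lambda x: abs(x - 7),
                             lambda x: x + random.choice([-1, 1]),
                             0)
```

At high temperature nearly every move is accepted (a random walk); as T falls toward zero the loop becomes greedy, which is how SA escapes local minima early on yet still settles into an optimum.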
Some recent applications of SA are discussed below.
Jayabalan et al.5 have proposed two modified versions of SAs to solve a scheduling problem known as permutation flowshop.
In another work, Cunha and Marques6 developed a trajectory-based SA with new reannealing and generation techniques that enable better convergence toward solutions with diversity and uniformity.
In a recent study,7 a mixed-integer programming problem is discussed, with SA proposed as the solution approach.
Odziemczyk8 has introduced a procedure to solve a three-dimensional coordinate transformation problem using a variant of the SA technique.
Grabusts et al.9 have experimented with the SA method and Travelling Salesman Problems (TSP) to find the shortest routes between destinations. Their study discussed applications of mathematical models that could be utilized to solve real-life problems.
Gallo and Capozzi10 have used SA to solve a scheduling problem that involves several key parameters such as tempering, freezing, etc. In one study,11 the authors used a modified SA procedure to solve a problem related to sample allocation.
In both studies by A. A. N. P. Redi,12,13 SA algorithms have been applied in routing problems. In the first study,12 the objective was to find minimum routes that require less transportation time whereas, in the second study,13 the research aimed to reduce the transportation expenses.
In a most recent study in Engineering optimization, Meng14 tried to discover new parameters pertaining to the creative and cultural domain of product design to improve the performance of the SA algorithm.
Managing tasks in a cloud computing environment is a huge and daunting job that requires smart handling. The study by Fanian et al.15 therefore contributed a better way of managing task scheduling by developing a new algorithmic approach inspired by the Firefly and SA algorithms.
Gripon et al.16 conducted experiments on training discrete weights using a combination of traditional SA and ideas taken from statistical physics.
P. Aravind et al.17 have used a heuristic method based on SA to overcome the problem of allocating switches within a short time for various networks. Their model was able to solve the problem within a reasonable amount of time for varied network sizes.
Avinash C. Pandey and Dharmveer S. Rajpoot18 focused on obtaining accurate feature-classification results by combining SA with grey wolf optimization.
Jie Zhou et al.19 introduced an efficient way to increase the life span of large-scale wireless sensor networks by using a special version of SA called the "Elite Adaptive Simulated Annealing Algorithm (EASA)".
Czerniachowska, K et al.20 have addressed an interesting problem of arranging products on shelves which can be a strenuous task involving retailing decisions. Therefore, they have applied the SA procedure to handle the effective decision making of shelf allocation of products on symmetrical planograms.
In the area of industrial cutting, one study investigated a multi-objective optimization problem for irregular objects,21 in the field of aquatic product processing, and developed an SA technique for the problem of squid cutting.
In this study, both SA and GA have been implemented in MATLAB (Version: R2017a, RRID: SCR_001622) based on the setup of the two algorithms stated in Table 1 and Table 2. MATLAB is software that provides functionality to implement algorithms, analyze data, and create models.
| GA parameters |
| --- |
| Population size: 100 |
| Reproduction: 60% |
| Crossover: 30% |
| Elitism: 10% |
| Mutation: 0.1 |
| Total number of generations: 1000 |
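As a rough illustration of how these parameters interact, the following is a minimal GA sketch in Python (the paper's implementation was in MATLAB). The two-variable example system, tournament selection, and ±1 mutation step are illustrative assumptions, not the authors' exact operators; only the population size, operator percentages, mutation rate, and generation cap come from the table above.

```python
import random

random.seed(0)
POP, ELITE, REPRO = 100, 10, 60          # 10% elitism, 60% reproduction (30% crossover fills the rest)
MUT = 0.1                                 # per-gene mutation rate

# Hypothetical example system: x + y = 5 and 2x - y = 1 (one solution: x = 2, y = 3)
def fitness(c):
    x, y = c
    return 3000 - abs((x + y) - 5) - abs((2 * x - y) - 1)

def tournament(pop):
    # Pick the fittest of 3 random individuals
    return max(random.sample(pop, 3), key=fitness)

def mutate(c):
    # Nudge each gene by +/-1 with probability MUT
    return [g + random.choice([-1, 1]) if random.random() < MUT else g for g in c]

pop = [[random.randint(-10, 10) for _ in range(2)] for _ in range(POP)]
for gen in range(1000):
    pop.sort(key=fitness, reverse=True)
    if fitness(pop[0]) == 3000:          # perfect fit: every equation satisfied
        break
    nxt = [c[:] for c in pop[:ELITE]]    # elitism: best 10% carried over unchanged
    nxt += [mutate(tournament(pop)) for _ in range(REPRO)]   # reproduction + mutation
    while len(nxt) < POP:                # remaining 30%: one-point crossover
        a, b = tournament(pop), tournament(pop)
        nxt.append(mutate([a[0], b[1]]))
    pop = nxt

best = max(pop, key=fitness)
```

A chromosome here is simply a vector of candidate values for the unknowns, and a perfect chromosome reaches the ceiling fitness of 3000.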
Simultaneous equations, in mathematics, play a very important role in modeling real-life problems. They comprise a finite set of equations whose solutions must meet at a common point in space.22 These equations involve functions of at least two variables and can be either linear or non-linear in character.
According to Gilbert,23 a system of simultaneous linear equations can be written in the following manner:

a11x1 + a12x2 + … + a1NxN = b1
a21x1 + a22x2 + … + a2NxN = b2
⋮
aN1x1 + aN2x2 + … + aNNxN = bN

This system can be represented in matrix-vector form as follows:

Ax = b

In this expression, the unknown variables are contained in the vector x, N is the number of unknown variables, and the constants a form the coefficient matrix A, with b the vector of right-hand-side constants.
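For comparison with the conventional numerical approach discussed earlier, the matrix-vector form can be solved directly by Gaussian elimination. Below is a short pure-Python sketch; the 3×3 example system is hypothetical and chosen only for illustration.

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting (pure Python)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]   # augmented matrix [A | b]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]                # partial pivoting: swap in largest pivot
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]               # eliminate below the pivot
    x = [0.0] * n
    for r in range(n - 1, -1, -1):                     # back-substitution
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Hypothetical 3x3 system A x = b:
#   2x + y - z = 3,  x - y + 2z = 0,  3x + y + z = 6   (solution: x = 0.6, y = 3.0, z = 1.2)
A = [[2.0, 1.0, -1.0], [1.0, -1.0, 2.0], [3.0, 1.0, 1.0]]
b = [3.0, 0.0, 6.0]
x = gauss_solve(A, b)
```

Note that such a direct solver returns a single solution set, whereas the SA and GA approaches studied here can surface multiple valid solution sets.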
In this study, both SA and GA have been implemented to find the values of unknowns in the equations.
The data is randomly generated to represent each unknown variable in a system of simultaneous equations. The generated data makes up each candidate solution set in SA and each chromosome in GA. In SA, a single vector of values, or solution set, is initially generated, while in GA, a population of 100 chromosomes is initially generated. To evaluate the fitness of each solution set in SA or chromosome in GA, the randomly generated vector of values is substituted into the left-hand side (LHS) of each equation. The LHS values are then checked against the right-hand side (RHS) values for equality. If the evaluated LHS values of the equations equal the given RHS values, the solution set or chromosome is considered a valid fit for the system of equations.
In order to evaluate the fitness of each solution set in SA or chromosome in GA, an arbitrary value of 3000 is taken as the start fitness value. For each equation in the system, the absolute value of the difference between the LHS and the RHS is taken, and the total is subtracted from the start fitness value of 3000, i.e.,

fitness = 3000 − Σ |LHS − RHS|

If the total difference is 0, the fitness remains unchanged (i.e., 3000); otherwise, the fitness decreases. To calculate the energy of each solution set in SA, the fitness value is subtracted from the same arbitrary value (i.e., 3000), since fitness and energy are opposites in the case of SA. This method of converting fitness to energy by subtraction resembles a flip-flop operation.
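The fitness and energy calculations described above can be expressed as a short Python sketch (the study used MATLAB); the two-variable example system is an illustrative assumption.

```python
def fitness(solution, equations):
    """Fitness = 3000 minus the total absolute LHS-RHS mismatch over all equations.

    `equations` is a list of (coefficients, rhs) pairs for linear equations;
    a perfect solution scores exactly 3000.
    """
    START = 3000
    error = sum(abs(sum(c * v for c, v in zip(coeffs, solution)) - rhs)
                for coeffs, rhs in equations)
    return START - error

def energy(solution, equations):
    # SA works with energy rather than fitness; flipping fitness about the
    # start value gives energy 0 at a perfect solution.
    return 3000 - fitness(solution, equations)

# Hypothetical system: x + y = 5 and 2x - y = 1 (solution x = 2, y = 3)
eqs = [([1, 1], 5), ([2, -1], 1)]
fitness([2, 3], eqs)   # -> 3000 (perfect fit)
energy([2, 3], eqs)    # -> 0
```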
Table 1 and Table 2 show the parameter setup for SA and GA, respectively. Table 3 presents the results of SA solving simultaneous equations. For this study, 10 different systems of equations, involving two, three and four unknown variables, were selected. Each experiment was performed 15 times to record the average performance of SA. As can be seen in Table 3, for the first three systems of equations, involving two variables (Experiment No. 1, 2 and 3), SA gave its best performance, solving them within a minimum of 3, 2 and 3 iterations each, while keeping the maximum average at 73 iterations (Experiment No. 3).
Again, with equations involving three variables, the algorithm performed relatively well, solving them within as few as 5 iterations at minimum and 57, 80 and 83 iterations on average (Experiment No. 4, 5 and 6). However, soon after that, the average number of iterations kept increasing, reaching a maximum of 176 iterations (Experiment No. 9). This increasing rate is perhaps an indication of the pattern complexity of the solution sets (usually a combination of both positive and negative numbers).
Also, it is interesting to note that in experiment No. 8, SA was able to find three different sets of solutions, making it an efficient method compared with traditional techniques like substitution and elimination, as discussed earlier. In experiment No. 10, involving 4 variables, there was a sudden but expected increase in the minimum number of iterations, reaching a maximum of 61 iterations. However, in the long run, SA managed to keep its pace, maintaining an average of 150 iterations, relatively lower than in experiment No. 9.
Overall, the SA algorithm was successful in solving the selected systems of equations within a reasonable number of iterations, with a maximum average of 176 iterations and a fitness-evaluation count of 176. Figures 1–10 give a glimpse of the performance of the SA algorithm implemented in MATLAB, showing the energy level at each iteration until it reaches 0 at the target iteration, where the expected solutions are found.
Table 4 presents the results of GA solving simultaneous equations. In order to evaluate the performance of SA against GA, the same set of equations used for SA was tested with GA. Each experiment was performed 15 times, just as for SA.
In the first three experiments (Experiment No. 1, 2 and 3), GA was successful in solving the equations within an average of 2 generations. However, there was a sudden increase in the average number of generations to 35 in experiment 4, with equations involving 3 variables. In the following experiments, 5 through 8, there was a gradual decline in the average number of generations (17, 12, 9 and 3, respectively), indicating the improved performance of the algorithm. In experiment 9, there was a dramatic increase in the average number of generations, reaching a maximum of 196. It is important to note that this sudden increase is consistent with the same experiment performed with SA earlier, where the average number of iterations reached a maximum of 176 from 120. Perhaps this increase is an indication of the pattern complexity of the solution sets (usually a combination of both positive and negative numbers), just as in the case of SA. In experiments 1 to 9, the algorithm was able to find solutions within a minimum of 1 or 2 generations, until in experiment 10 (involving equations with 4 unknowns) there was a significant rise in the minimum number of generations to 8 and the average number of generations to 50, due to the increase in problem size. However, the algorithm maintained a much lower average number of generations compared with the 196 of experiment 9, which is again consistent with the SA results shown earlier. GA was also able to find three different sets of solutions in experiment 8, just like SA, making both optimization techniques more efficient, in general, than traditional methods like substitution and elimination, as mentioned earlier.
Even though GA, like SA, was able to solve all the equations within a maximum average of 196 generations, the comparison with SA reveals a large difference: the maximum average number of iterations for SA stands at 176, lower than for GA. Moreover, in SA a single iteration evaluates just one solution set, whereas in GA a single generation evaluates a whole population of 100 chromosomes, or solution sets (as configured for this study). Therefore, the average fitness-evaluation count for SA stands at 176, while for GA it stands at 196 × 100 = 19,600, since fitness is evaluated for the entire population of chromosomes in each generation. This makes GA computationally expensive, unlike SA, where fitness is evaluated for just one solution set at each iteration.
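The evaluation-count comparison above reduces to simple arithmetic, reproduced here for clarity using the averages reported in the text:

```python
# Per-run fitness-evaluation counts implied by the reported averages
sa_evals = 176 * 1        # SA: one candidate evaluated per iteration
ga_evals = 196 * 100      # GA: whole population of 100 evaluated per generation
ratio = ga_evals / sa_evals   # GA performs roughly 111x more evaluations
```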
Figures 11–14 give a glimpse of the performance of the GA, implemented in MATLAB, showing the maximum and average population fitness of each generation until it reaches the maximum fitness of 3000 (the defined arbitrary fitness value) at the target generation and finds the expected solution sets.
This research successfully applied the Simulated Annealing (SA) algorithm and evaluated its performance against Genetic Algorithms (GAs). During this experimental study, 10 different systems of simultaneous linear equations involving 2, 3 and 4 variables were solved.
In all the experiments, an initial solution set (for SA) or a population of solution sets (for GA) was randomly generated for the optimization to take place.
In the case of SA, the linear cooling schedule was used to control the temperature. The linear cooling schedule was effective in successfully finding all the solutions. Further, slow cooling by adding a constant (C = 50 and C = 100) was also tried in some cases but that did not make a big difference in the number of iterations to find solutions.
In the case of GA, a population size of 100, a reproduction rate of 60%, a crossover of 30% and elitism of 10% were used. The mutation rate was kept at 0.1.
During the experimentation phase, the following observations were made:
• Both SA and GA were equally able to solve the simultaneous equations, and both produced more than one solution set for the respective systems. For instance, in experiment 8 (Table 3 and Table 4), both SA and GA found three different solution sets, each a perfect fit for the equations. This indicates that these optimization techniques are more efficient than conventional methods such as the Gaussian Elimination method.
• Even though GA maintained a relatively lower average number of generations than SA's iterations, SA still outperformed GA with a considerably lower fitness-evaluation count and thereby proved far more computationally efficient than GA.
• Even though SA converges slowly sometimes, it can still be regarded as an efficient technique for solving problems that require optimization.
• In terms of computational complexity, the SA algorithm proved to be superior to GAs.
This study applied the Simulated Annealing (SA) algorithm to solve simultaneous linear equations and evaluated its performance against Genetic Algorithms (GAs).
The limitation of this paper is that the real-life problem has been addressed with only linear equations rather than both linear and non-linear equations.
In future, however, the focus of this research will be on solving non-linear systems of equations with SA. Furthermore, as part of future work, Harmony Search,24 a recent optimization technique based on music improvisation, could be used to solve systems of equations.
Md. Shabiul Islam: Conceptualization, Supervision, Writing-Review & Editing
Most Tahamina Khatoon: Conceptualization, Data Curation, Formal Analysis, Investigation, Methodology, Resources, Software, Validation, Visualization, Writing-Original Draft
Kazy Noor-e-Alam Siddiquee: Conceptualization, Formal Analysis, Methodology, Project Administration, Writing-Original Draft Preparation, Writing-Review & Editing
Wong Hin Yong: Writing-Review & Editing
Mohammad Nurul Huda: Supervision, Writing-Review & Editing
Is the work clearly and accurately presented and does it cite the current literature?
No
Is the study design appropriate and is the work technically sound?
No
Are sufficient details of methods and analysis provided to allow replication by others?
Partly
If applicable, is the statistical analysis and its interpretation appropriate?
No
Are all the source data underlying the results available to ensure full reproducibility?
Yes
Are the conclusions drawn adequately supported by the results?
Partly
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: Genetic Algorithms, Neural Networks, and Deep Learning, Fuzzy Logic, Artificial Intelligence, Multiagent systems
Is the work clearly and accurately presented and does it cite the current literature?
Yes
Is the study design appropriate and is the work technically sound?
Yes
Are sufficient details of methods and analysis provided to allow replication by others?
Yes
If applicable, is the statistical analysis and its interpretation appropriate?
Yes
Are all the source data underlying the results available to ensure full reproducibility?
Yes
Are the conclusions drawn adequately supported by the results?
Yes
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: Algorithms, Machine Learning, Natural Language Processing, Deep Learning.
Alongside their report, reviewers assign a status to the article:
| | Reviewer 1 | Reviewer 2 |
|---|---|---|
| Version 1 (20 Dec 21) | read | read |