


Engineering with Computers
https://doi.org/10.1007/s00366-022-01604-x

ORIGINAL ARTICLE

Sand Cat swarm optimization: a nature-inspired algorithm to solve global optimization problems

Amir Seyyedabbasi1 · Farzad Kiani1

Received: 20 February 2021 / Accepted: 10 January 2022


© The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2022

Abstract
This study proposes a new metaheuristic algorithm called sand cat swarm optimization (SCSO), which mimics the behavior of sand cats trying to survive in nature. These cats are able to detect low frequencies below 2 kHz and also have an incredible ability to dig for prey. The proposed algorithm, inspired by these two features, consists of two main phases (search and attack). The algorithm controls the transitions between the exploration and exploitation phases in a balanced manner and performs well in finding good solutions with fewer parameters and operations. It does so by finding the direction and speed of the appropriate movements with the defined adaptive strategy. The SCSO algorithm is tested on 20 well-known test functions along with 10 modern complex test functions of the CEC2019 benchmark, and the obtained results are compared with famous metaheuristic algorithms. According to the results, SCSO is the algorithm that found the best solution in 63.3% of the test functions. Moreover, the SCSO algorithm is applied to seven challenging engineering design problems: welded beam design, tension/compression spring design, pressure vessel design, piston lever, speed reducer design, three-bar truss design, and cantilever beam design. The obtained results show that the SCSO performs successfully in convergence rate and in locating all or most of the local/global optima, and outperforms the other compared methods.

Keywords Metaheuristics · Sand cat swarm optimization · Swarm intelligence · Optimization

* Amir Seyyedabbasi, amir.seyyedabbasi@gmail.com
Farzad Kiani, farzad.kiyani@gmail.com
1 Software Engineering Department, Faculty of Engineering and Natural Science, Istinye University, Istanbul, Turkey

1 Introduction

In real life, optimization means minimizing time and cost besides maximizing efficiency and quality. In fact, there are many problems that need to be optimized, which are complex and hard. The heuristic and metaheuristic algorithms are two methods for solving such problems and getting optimized solutions. The heuristic algorithms are problem-dependent, while metaheuristic algorithms are not dependent on a particular problem, and the optimum solution is obtained from the random search space and predefined boundary [1]. The larger and more complex a problem is, the more difficult it becomes to solve [2]; especially, its time and memory complexities can deteriorate significantly. The most prominent of these types of problems are Nondeterministic Polynomial time (NP-hard) problems. The NP-hard problems can be solved by exact and approximate methods. The exact methods guarantee the optimal solution, but with exponential time complexity and high cost. Both the heuristic and metaheuristic algorithms belong to the family of approximate algorithms, which do not guarantee to find the optimal solution, but the obtained solution may be nearer to the optimal solution with better complexity and actual execution time. Therefore, the second group is generally preferred. On the other hand, considering that heuristic methods have problems such as local traps, the metaheuristic approach can provide a practical and useful solution for many NP-hard problems and generally performs well in finding optimal solutions to these types of problems in actual execution time [3].

The metaheuristic methods do not fall into local traps; in addition, they are more flexible and try to find the best solutions in global search areas with fewer process costs, in a shorter time, and with simple implementation. In particular, the performance of metaheuristic algorithms can be more successful in solving complex problems [2]. In addition, the no-free-lunch (NFL) theorem [4] asserts that there is no specific algorithm providing the best solution for every optimization


problem. As such, there is a considerable demand to develop new metaheuristic algorithms that can be used in various problems. Therefore, due to their wide area of use and advantages, in recent years metaheuristic optimization algorithms have become popular in many fields of science [5–7]. In general, the metaheuristic algorithms are inspired by biological or physical phenomena [8] and try to find an optimum solution (maximum or minimum) in a reasonable time.

The metaheuristic algorithms are generally divided into two classes: single-solution and population-based (multiple) algorithms. In single-solution algorithms a single solution influences the output, while the entire population is included in the population-based algorithms. In addition, the metaheuristic algorithms are classified into evolutionary, physics-based, and swarm intelligence algorithms. The evolutionary algorithms (EAs) are inspired by natural evolution and Darwin's theory of evolution [9]. EAs solve a problem within a random search space, and the EA is a population-based method in which the whole population affects the optimum solution. The genetic algorithm (GA) is a well-known algorithm in the EA category [10], which is inspired by generation reproduction. The GA generates new generations by mimicking crossover, mutation, and elitism to find global optima. The differential evolution (DE) algorithm is another algorithm that is motivated by natural evolution. The DE algorithm differs from the GA in the selection operation used to generate the next generation [11]. The improved quantum-inspired cooperative coevolution algorithm (MSQCCEA) [12] is one of the current studies. This algorithm improves the global search capability and does not fall into the local optimal trap. Accordingly, a quantum spin direction strategy has been developed to change the quantum evolution direction from one to two. It was applied to knapsack and actual airport gate allocation problems, and the results show the good performance of the introduced algorithm in fast and accurate convergence. The Tabu search (TS) algorithm is a different example of evolutionary algorithms [13]. Evolutionary programming (EP) is inspired by behavioral models such as phenotype, heredity, and variation [14]. The biogeography-based optimizer (BBO) [15] and black widow optimization (BWO) [16] are other examples of EAs.

The physics-based algorithms are another family member of metaheuristic methods. These algorithms are inspired by the physics rules in nature and act randomly according to the assumed physical events. In this kind of algorithm, the search space and optimal solution follow physical rules such as electromagnetic force, gravitational force, and inertial force. There are famous optimization algorithms in this category such as the gravitational local search algorithm (GLSA) [17], the gravitational search algorithm (GSA) [18], and the Big Bang–Big Crunch (BBBC) [19]. The Black Hole (BH) algorithm, which is inspired by the black hole phenomenon [20], is another algorithm in the physics-based category. Other famous works in this category are charged system search (CSS) [21], curved space optimization [22], central force optimization (CFO) [23], and the galaxy-based search algorithm (GbSA) [24]. In the methods in this category, search agents act randomly according to the assumed physical events.

Swarm intelligence (SI) is the third category of the metaheuristic algorithms. Generally, the SI methods are inspired by the social behaviors of the animals that live in swarms, herds, and flocks in nature. In this kind of algorithm, the search agents try to find an optimal solution by influencing social intelligence. Particle swarm optimization (PSO) is the most famous algorithm in this category [25]. The PSO has been inspired by the birds' movement and social behaviors in nature. The particle indicates candidate solutions, and the algorithm tries to find the best solution from the search space based on these particles. Ant colony optimization (ACO) is another algorithm in this category [26]. The ACO algorithm simulates the foraging behaviors of ants. The main goal of an ant in nature is to find the destination (food) and generate an optimal (secure and minimum cost) path between source and destination. In the literature, this algorithm has been used to solve many types of problems [27, 28]. The artificial bee colony (ABC) [29] algorithm is another important algorithm that mimics the social behavior of honeybees. In the ABC discovering, a path is generated between destination (nest) and source (food). Searching for a rich food resource and the bees' experience are important phases in the ABC algorithm to find the optimal solution. Other algorithms in this category are the bat algorithm (BA) [30], firefly algorithm (FA) [31], grey wolf optimization (GWO) [32] and its different variants [33], whale optimization algorithm (WOA) [34], dragonfly algorithm (DA) [35], cuckoo search (CS) [36], butterfly optimization algorithm (BOA) [37], wind-driven optimization (WDO) [38], cat swarm optimization (CSO) [39], fruit fly optimization algorithm (FFOA) [40], flower pollination algorithm (FPA) [41], Salp swarm algorithm (SSA) [42], Pathfinder algorithm (PFA) [43] and its improved version [3], and the Harris hawks optimization (HHO) algorithm [44]. In addition, hybrid metaheuristic algorithms have become widespread in recent years. Two of the valuable studies done in this context are presented in [45, 46]. In [45], the authors present a novel method based on the DA and WDO algorithms. The authors corrected the shortcomings of these two algorithms and, as a result, proved that their proposed method has fast convergence speed and strong global search and is accurate in finding solutions. In [46], the authors were inspired by the BOA and FPA algorithms and proposed a flexible algorithm to solve global optimization problems. The authors suggested a new mechanism for efficient switching between exploration


and exploitation phases, and thereby the algorithm exhibits balanced behavior.

The main phases of metaheuristic algorithms are exploration and exploitation, and each algorithm has specific strategies to realize the concepts of these phases. The more balanced these two phases are in the proposed algorithms, the higher the success rate can be. In these strategies, the randomness and the appropriate defined coefficients are important. It is very important to set the relevant coefficient parameters properly to ensure a good balance between these two phases [33]. Exploration means searching globally. This phase needs more search, so the relevant algorithm randomly selects a solution(s) from the search space. The exploitation phase comes after the exploration phase and focuses on the solution(s) in the search space in order to improve it. In other words, unvisited areas are checked to see whether they are potential escape areas from poor local optima. The randomness of the search in exploration helps the algorithm avoid being trapped in local optima while exploring for the global optimum.

In this paper, a new metaheuristic optimization algorithm has been proposed which may be more suitable for solving various problems and which, according to the NFL theorem, behaves in balance between the related phases. Also, the proposed algorithm has a more favorable complexity than other current metaheuristic algorithms. The proposed algorithm is inspired by the search and hunting behavior of the sand cat and is named sand cat swarm optimization (SCSO). The sand cats live alone in nature but are considered a herd that can be identified by the user as search agents in the proposed algorithm. In the development of the proposed algorithm, the focus is on the low-frequency noise detection behavior of sand cats to find prey. These cats detect low frequencies below 2 kHz, and thus they may catch prey at long distances in a possibly short time and with little action. Moreover, sand cats have an incredible ability to dig for prey. Thanks to these two wonderful features, sand cats have a special ability in searching and hunting in nature. The proposed algorithm also behaves in balance in the exploration and exploitation phases with an adaptive mechanism. The key contributions of this paper are summarized below.

1. A new population-based optimization algorithm called SCSO mimics the searching and hunting behavior of the sand cats.
2. Various tests have been carried out on 20 well-known benchmark functions and 10 modern benchmark functions (CEC2019) for performance evaluation of the proposed algorithm, as well as seven case studies as engineering optimization problems. Its results are compared with various other well-known metaheuristic algorithms to evaluate its performance. Experimental results show that the SCSO algorithm is more successful in solving complex optimization problems.
3. The proposed SCSO algorithm is able to escape from the local optima trap and has a suitable and balanced behavior between the exploitation and exploration phases in comparison with other investigated algorithms.
4. The SCSO has fewer parameters and operators compared to other metaheuristic algorithms.
5. The SCSO is easier to implement than the other metaheuristic algorithms.

The rest of this paper is organized as follows. Section 2 explains the sand cat behaviors in depth and the mathematical model of the SCSO. Section 3 presents the simulation results and discussions of benchmark test functions. Section 4 explains the results on seven engineering optimization problems. The conclusion and future works take place in the last section.

2 Sand cat swarm optimization (SCSO)

This section first describes the sand cats, which were the inspiration for the proposed algorithm. Then, the mathematical model and working mechanism of the proposed method are presented.

2.1 Inspiration: sand cats in nature

The sand cat (Felis margarita) is one kind of the Felis animals, which is a family of mammals. The sand cat lives in harsh environments in sandy and stony deserts, such as the central Asian deserts, the African Sahara, and the Arabian Peninsula. The small, nimble, and humble sand cat has different life behaviors in hunting and living. The living behavior of a sand cat is unlike the domestic cat, although there is no big difference in appearance between a sand cat and a domestic cat. The sand cats do not live in a group like many felines. The sand cat's palms and soles of the feet are covered with a higher density of sandy to light grey fur. The fur coverage of the soles of the feet insulates the feet pads against the desert's harsh hot and cold weather. Furthermore, the fur characteristics of the sand cat make detection and tracking indistinct and difficult. The sand cat's body length is about 45–57 cm. Its tail is about half of the head and body length (28–35 cm), with short legs, besides the front claws that are short and sharply curved. In contrast, the back claws are more elongated and weakly curved. The adult sand cat's weight ranges from 1 to 3.5 kg. The ears on the sides of the head are 5–7 cm long. The ears of the sand cat have a considerable role in foraging. The nocturnal, subterranean, and secretive nature of this cat has made it special [47, 48].


Finding food is hard for any animal in a harsh and punishing environment. The sand cats have difficulties in the deserts as well, but they benefit from the cool night to find food. During the day they usually rest underground and try to hunt at night. These cats spend most of their time in their burrows, but in a posture lying on their back to release internal heat. Besides, to overcome thirst they derive water from food. They usually eat about 10% of their body weight, but they can also eat more than normal. The male sand cats walk more than the females at night, averaging 5.5 km and 3.2 km for male and female cats, respectively. In the winter season, the average walking distance is lower than in the summer season [49]. Like the rest of the Felis family, the sand cat uses its paws for hunting. They are fine hunters and eat small desert rodents, reptiles, small birds, spiny mice, and insects, and they are proficient snake hunters.

The hunting mechanism in sand cats is quite interesting. They use their fantastic sense of hearing and pick up low-frequency noises. In this way, the sand cats detect the prey (insects and rodents) moving underground. There is no difference between sand and domestic cats in the ear's pinna flange (outer ear). In the case of the middle ear, the ear canal length in sand cats is longer than in domestic cats, which causes a large volume of air space in the middle ear. Furthermore, the sand cat can detect the difference in time-of-arrival between different sounds. The amount of acoustic input-admittance of the ear in cats is related to the tympanic membrane, and it is 5 times higher in sand cats than in domestic cats. Also, the middle ear cavities and bone chains influence the increase of the acoustic input-admittance. Scientific studies show that the sensitivity of the sand cat for frequencies below 2 kHz is incredible; in this frequency range, sand cats are about 8 dB more sensitive than domestic cats [47, 48]. These unique characteristics can be the reason that the sand cat detects noise (prey movement), tracks prey, and attacks successfully based on the prey location (Fig. 1). The sand cat also has a curious ability to rapidly dig if the prey is underground. According to the behaviors of the sand cat, there are two stages in foraging: searching and attacking the prey. In the algorithm proposed in this paper (SCSO), these two stages are emphasized. In addition, a mechanism is suggested for the realization of the exploration and exploitation phases and to achieve the balance between them.

Fig. 1 Sand cats in nature: a living, b searching, and c hunting

2.2 Mathematical model and optimization algorithm

The sand cat swarm optimization (SCSO) algorithm has been inspired by the sand cat behaviors in nature. The two main actions of the sand cat are foraging for prey and attacking the prey. The proposed algorithm is inspired by the special feature of the sand cat, which is the ability to detect low-frequency noises. Sand cats can locate prey thanks to this extraordinary feature, whether it is on the ground or underground, and can find and catch their prey quickly. Since the sand cat lives alone in nature, in the


proposed algorithm the authors assumed sand cats as a herd, to emphasize the concept of swarm intelligence. So, in the algorithm initialization, the number of sand cats can be declared to optimize a minimization or maximization problem. For this, the first step is creating the initial population and defining the problem.

2.2.1 Initial population

In the solution of an optimization problem, the values of the relevant variables should be defined in accordance with the solution of the current problem. For example, in PSO it is called a particle position [25] and in GWO it is called a grey wolf position [32]. In our SCSO algorithm, it is called a sand cat, and each cat shows the values of the problem variables. In the proposed algorithm, which is a population-based method, the related structure is defined as a vector. In a d-dimensional optimization problem, a sand cat is a 1 × d array representing the solution to the problem, defined as shown in Fig. 2. Each of the variable values (x1, x2, …, xd) is a floating-point number. Here, every x must be located between the lower and upper boundaries (∀xi ∈ [lower, upper]). To start the SCSO algorithm, first, a candidate matrix is created with the sand cat population according to the size of the problem (Npop × Nd), (pop = 1, …, n).

In addition, the fitness cost of each sand cat is obtained by evaluating the defined fitness function. This function defines the relevant parameters of the problem, and the best values of the parameters (variables) will be obtained by the SCSO. A value of the corresponding function is output for each sand cat. When an iteration is finished, the sand cat with the best cost in that iteration is chosen as the best solution so far (if there was no answer as good as this in the previous iterations), and the other sand cats try to move towards this best-chosen cat in the next iteration, because the best solution in each iteration can represent the cat closest to the prey. If a better solution is not found in the next iterations, the solution for that iteration is not unnecessarily stored in memory, and this ensures efficient use of memory. The process shown in Fig. 2 takes place at every iteration.

Fig. 2 Working mechanism of SCSO in the initial and definition phase: the population is an n × d matrix of positions with a cost per row; Sand Cat_i = [x1, x2, …, xd], i ∈ population (1, n); Fitness = f(Sand Cat) = f(x1, x2, …, xd), ∀xi (calculated n times)

2.2.2 Searching the prey (exploration)

In this section, the searching mechanism of the SCSO algorithm is described. The prey search mechanism of sand cats relies on low-frequency noise emission. The solution for each sand cat is represented as Xi = (xi1, xi2, xi3, …, xid). The SCSO algorithm benefits from the hearing ability of sand cats in detecting low frequencies. In this way, the sensitivity range for each cat is declared. As mentioned before, the sand cat senses low frequencies below 2 kHz. In the mathematical model, this value (rG) will linearly decrease from 2 to 0 as the iterations progress, according to the working mechanism of the proposed algorithm, so as to approach the hunted prey and not to lose or pass it (not to move away). Thus, to search for prey, it is assumed that the sand cat sensitivity range starts from 2 kHz and goes down to 0 (Eq. 1). Since the sM value is inspired by the hearing characteristics of the sand cats, its value is assumed to be 2. However, when solving different problems, this value, which determines the speed of action of the agents, can be customized accordingly. This proves the flexibility and versatility of the presented equation. For example, if the maximum number of iterations is 100, the value of rG will be greater than 1 in the first 50 iterations and less than 1 in the last 50 iterations. It is worth mentioning that the final and main parameter in controlling the transition between the exploration and exploitation phases is R, a vector obtained from Eq. 2. Thanks to this adaptive strategy, the transitions and possibilities in the two phases will be more balanced.

The search space is randomly initialized between the defined boundaries. In the searching step, the position update of each current search agent is based on a random position. In this way, the search agents are able to explore new spaces in the search space. The sensitivity range for each sand cat is different, to avoid the local optimum trap, and it is realized by Eq. 3. Here, rG indicates the general sensitivity range, which decreases linearly from 2 to 0, while r denotes the sensitivity range of each cat. The r is used for operations in the exploration or exploitation phases, while rG guides the R parameter for transition control between these phases. In addition, iterc is the current iteration and iterMax is the maximum number of iterations.

rG = sM − (2 × sM × iterc) / (iterMax + iterMax)    (1)

R = 2 × rG × rand(0, 1) − rG    (2)

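The adaptive control parameters described above can be sketched in a few lines. The following is a minimal Python illustration (not the authors' MATLAB release); the function name `adaptive_parameters` and the use of Python's `random` module are our own choices:

```python
import random

def adaptive_parameters(iter_c, iter_max, s_m=2.0):
    """Compute the SCSO control parameters of Eqs. (1)-(3).

    s_m = 2 mirrors the 2 kHz hearing limit of the sand cat.
    Returns (r_g, big_r, r): the general sensitivity range, the
    exploration/exploitation transition parameter, and the per-cat
    sensitivity range.
    """
    # Eq. (1): r_g decreases linearly from s_m to 0 over the iterations.
    r_g = s_m - (2.0 * s_m * iter_c) / (iter_max + iter_max)
    # Eq. (2): big_r controls the transition between the two phases.
    big_r = 2.0 * r_g * random.random() - r_g
    # Eq. (3): every cat draws its own sensitivity range in [0, r_g].
    r = r_g * random.random()
    return r_g, big_r, r
```

With `iter_max = 100`, `r_g` is exactly 1 at the half-way iteration, matching the 50/50 split of the two phases described above.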

r = rG × rand(0, 1)    (3)

Each search agent (sand cat) updates its own position based on the best-candidate position (Posbc), its current position (Posc), and its sensitivity range (r). Therefore, the sand cats are able to find other possible best prey positions (Eq. 4). This equation gives another chance to the algorithm to find new local optima in the search area. Hence, the obtained position is located between the current position and the position of the prey. Moreover, this is achieved by randomness, not by exact methods. In this way, the search agents in the algorithm benefit from the randomness. This causes the algorithm to be low in operation cost and efficient in complexity.

Pos(t + 1) = r ⋅ (Posbc(t) − rand(0, 1) ⋅ Posc(t))    (4)

2.2.3 Attacking the prey (exploitation)

As mentioned before, the sand cats detect the prey based on their ears' ability. To mathematically model the attacking phase of SCSO, the distance between the best position (Posb) (best solution) and the current position (Posc) of the

Fig. 3 Position updating mechanism in SCSO: a position updating in iteration_i, b position updating in iteration_i+1


sand cat is calculated by Eq. 5. Furthermore, the sand cat's sensitivity range is assumed to be a circle, so the direction of movement is determined by a random angle (θ) on the circle. Of course, in specifying the direction of movement, the other parameters declared in Eq. 5 are also effective. Since the chosen random angle is between 0 and 360 degrees, its cosine will be between −1 and 1. In this way, each member of the population is able to move in a different circular direction in the search space. The SCSO uses the Roulette Wheel selection algorithm to select a random angle for each sand cat. In this way, the sand cat can approach the hunting position. The random angle is also used to avoid the local optimum trap. Using the random angle in Eq. 5 has a positive effect on the approach of the agents to the hunted prey and guides them. Posrnd indicates the random position and ensures that the cats involved can be close to the prey. Position updating in SCSO in two consecutive iterations is shown in Fig. 3. As the iterations progress, the cats approach the hunted prey. In addition, a representative working example of SCSO from the beginning to the last iteration is shown in Fig. 4.

Posrnd = |rand(0, 1) ⋅ Posb(t) − Posc(t)|
Pos(t + 1) = Posb(t) − r ⋅ Posrnd ⋅ cos(θ)    (5)

2.2.4 Exploration and exploitation

Exploration and exploitation are guaranteed by the adaptive values of the rG and R parameters. These parameters allow SCSO to switch seamlessly between the two phases. Since the R parameter depends on rG, its fluctuation range will also decrease. As stated before, when the values of the rG parameter are distributed in a balanced way, the R value will also be well balanced, and therefore the chances of operations between the two phases will be appropriate according to the problem. In other words, R is a random value in the interval [−2rG, 2rG], where rG is reduced linearly from 2 to 0 over the iterations. When the random values of R are in [−1, 1], the next position of a sand cat can be at any position between its current position and the hunt position. The SCSO algorithm forces the search agents to exploit when |R| is lower than or equal to 1; otherwise, the search agents are forced to explore and find the prey. In the searching-the-prey phase (exploration), the different radius of each cat avoids the local optimum trap. This feature is one of the effective parameters in attacking the prey (exploitation) too. Equation 6 shows the position update for each sand cat in the exploration and exploitation phases. As shown in Fig. 5, when |R| ≤ 1, sand cats are directed to attack their prey; otherwise, the cats are tasked with finding a new possible solution in the global area. The balanced behavior of the proposed algorithm and the effort to find other possible local areas in the global space yield a fast and accurate convergence rate, and therefore it performs well in high-dimensional and multi-objective problems. The pseudocode and flowchart of the proposed algorithm (SCSO) are given in Algorithm 1 and Fig. 6, respectively. The open-source code of the SCSO algorithm is available at https://www.mathworks.com/matlabcentral/fileexchange/87659-sand-cat-swarm-optimization-algorithm

X(t + 1) = Posb(t) − Posrnd ⋅ cos(θ) ⋅ r,   if |R| ≤ 1 (exploitation)
X(t + 1) = r ⋅ (Posbc(t) − rand(0, 1) ⋅ Posc(t)),   if |R| > 1 (exploration)    (6)

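Putting Eqs. 4–6 together, one SCSO position update can be sketched as follows. This is an illustrative Python rendering, not the authors' MATLAB release: it draws θ uniformly from [0, 2π] instead of using Roulette Wheel selection, and the function and variable names are our own assumptions.

```python
import math
import random

def update_position(pos_b, pos_c, r, big_r):
    """One SCSO position update (Eq. 6), applied dimension-wise.

    pos_b: best position found so far; pos_c: the cat's current
    position; r: per-cat sensitivity range (Eq. 3); big_r: phase
    transition parameter (Eq. 2).
    """
    if abs(big_r) <= 1:
        # Attacking the prey (exploitation), Eq. (5).
        theta = random.uniform(0.0, 2.0 * math.pi)  # random direction on the circle
        new = []
        for b, c in zip(pos_b, pos_c):
            pos_rnd = abs(random.random() * b - c)  # random distance to the prey
            new.append(b - r * pos_rnd * math.cos(theta))
        return new
    # Searching the prey (exploration), Eq. (4).
    return [r * (b - random.random() * c) for b, c in zip(pos_b, pos_c)]
```

Iterating this update over the whole population, while shrinking rG per Eq. 1, reproduces the exploration-to-exploitation schedule described above.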

Fig. 4 The working example of SCSO from beginning to last iteration: a performance in the beginning iteration, b performance in the first iterations, c performance towards the last iterations, d performance in the last iteration

2.2.5 Computational complexity

The computational complexity of the calculation of the defined control parameters is O(n × m), where n represents the population size and m represents the size of the problem. Also, the initialization phase requires O(n × m) time. Besides, the computational complexity of the agents' position update is also O(n × m). Considering that n and m are equal, the general computational complexity of the proposed algorithm is O(n²); in other words, the computational complexity of the SCSO algorithm is O(n × m) = O(n²) if n = m. Indeed, T(n) = (n + 1) + (m + 1) + c(n × m) + O(1) → O(1) + Σ_{i=1}^{n} Σ_{j=1}^{m} c ≅ O(n²).

Fig. 5 Attacking prey (exploitation) versus searching for prey (exploration)

3 Results and discussion

This section presents the performance evaluation of the SCSO algorithm on 30 benchmark test functions. Its results are compared with seven well-known algorithms (BWO [16], GSA [18], PSO [25], GWO [32], WOA [34], CSO [39], and SSA [42]). These benchmark functions are chosen from CEC2014 [50], CEC2015 [51], and CEC2019 [52]. They are the classical functions utilized by many researchers [32, 53–56]. The variety of the selected functions has been increased to improve the accuracy of both the results obtained and the analysis performed. The functions used in this paper are divided into three groups: unimodal, multimodal, and fixed-dimension multimodal. These functions are employed to examine the efficiency of the SCSO algorithm. The unimodal benchmark functions have just one global optimum (maximum or minimum) and no local optima. Multimodal functions have more than one

13
Engineering with Computers

dimension of each benchmark function is static, and it is


not able to be tuned the dimension number, whereas, in the
multimodal function, it can be tuned.
The simulation and analysis presentation has been per-
formed using MATLAB on the Core i7-5500 U 2.4 proces-
sor with 8 GB of RAM computer. Table 1 presents the simu-
lation parameters of each algorithm. In addition, Tables 2
and 6 presents the detailed information of each benchmark
function that dim is indicating the dimension; the range
is the boundary limit of each function in the search space
between lower bound and upper bound. fmin is the global
optimum value. To evaluate the performance of the SCSO
algorithm with the other optimization algorithms the pop-
ulation size and maximum iteration number were consid-
ered 30 and 500 respectively, besides the average value of
each objective function obtained from 30 independent runs
for each algorithm. Furthermore, the 2D surface of some
benchmarks used is presented in Fig. 7. In the second stage
of performance evaluation and analysis, seven engineering
optimization problems are addressed. Results for this are
presented in Sect. 4.

3.1 Exploration and exploitation analysis

The performance of the unimodal benchmark functions is


given in Table 3. In this category of functions, the SCSO
algorithm found the optimal solution in the functions ­F1,
­F2, ­F3, ­F4, and F
­ 7. The GWO algorithm was best in the F ­5
Fig. 6  The flowchart of the sand cat swarm optimization algorithm and the GSA algorithm achieved the best result in ­F6. In
this category of benchmark functions, since there is only
one global optimum, the performance of the relevant algo-
local optimum and just have one global optimum. This type rithms is evaluated during the exploitation phase. Based on
of benchmark function is helpful to evaluate optimization the results, the performance of the SCSO algorithm is seen
algorithms performance in the exploration and exploitation to be superior in the related functions, so it is observed to
phases. In the fixed-dimension multimodal function, the be fast and successful in the exploitation phase. In Tables 4

Table 1  Simulation parameters for each optimization algorithm

PSO: maximum weight 0.9; minimum weight 0.4; acceleration constants c1 and c2 = 2
BWO: percentage of crossover 0.8; mutation rate 0.4; cannibalism rate 0.5
CSO: mixture ratio 0.75; seeking memory pool 5; seeking range of the selected dimension 0.25; counts of dimension to change 0.65
GSA: alpha 20; gravitational constant G0 = 100
SCSO: sensitivity range rG ∈ [2, 0]; phases control range R ∈ [−2rG, 2rG]
WOA: a ∈ [2, 0]; A ∈ [2, 0]; C = 2·rand(0, 1); l ∈ [−1, 1]; b = 1
GWO: a ∈ [2, 0]; A ∈ [2, 0]; C = 2·rand(0, 1)
SSA: initial speed v0 = 0
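The SCSO rows of Table 1 map directly to a per-iteration schedule: the sensitivity range rG decays linearly from 2 to 0 over the run, and the phases control parameter R is drawn around it. A minimal sketch; the names and the sampling of R as 2·rG·rand(0, 1) − rG are my assumptions based on the table:

```python
import random

S_M = 2.0  # maximum sensitivity range (the hearing-inspired constant, 2)

def control_parameters(t, t_max):
    """Return (rG, R) for iteration t out of t_max (Table 1, SCSO rows)."""
    rg = S_M - (S_M * t / t_max)         # rG decreases linearly from 2 to 0
    R = 2.0 * rg * random.random() - rg  # drawn from [-rG, rG] at this iteration
    return rg, R
```

Early iterations (large rG) make |R| > 1 likely, which favors exploration; late iterations force |R| ≤ 1, which favors exploitation.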

Table 2  Benchmark functions (CEC14, 15)

F1 Sphere: f1(x) = Σ_{i=1}^{n} xi²; Dim 30; Range [−100, 100]; fmin 0; Unimodal
F2 Schwefel 2.22: f2(x) = Σ_{i=1}^{n} |xi| + Π_{i=1}^{n} |xi|; 30; [−10, 10]; 0; Unimodal
F3 Schwefel 1.2: f3(x) = Σ_{i=1}^{n} (Σ_{j=1}^{i} xj)²; 30; [−100, 100]; 0; Unimodal
F4 Schwefel 2.21: f4(x) = max_i {|xi|, 1 ≤ i ≤ n}; 30; [−100, 100]; 0; Unimodal
F5 Generalized Rosenbrock: f5(x) = Σ_{i=1}^{n−1} [100(x_{i+1} − xi²)² + (xi − 1)²]; 30; [−30, 30]; 0; Unimodal
F6 Step: f6(x) = Σ_{i=1}^{n} (⌊xi + 0.5⌋)²; 30; [−100, 100]; 0; Unimodal
F7 Quartic: f7(x) = Σ_{i=1}^{n} i·xi⁴ + random[0, 1); 30; [−1.28, 1.28]; 0; Unimodal
F8 Generalized Schwefel: f8(x) = Σ_{i=1}^{n} −xi·sin(√|xi|); 30; [−500, 500]; −418.9829 × 5; Multimodal
F9 Rastrigin: f9(x) = Σ_{i=1}^{n} [xi² − 10cos(2πxi) + 10]; 30; [−5.12, 5.12]; 0; Multimodal
F10 Ackley: f10(x) = −20·exp(−0.2·√((1/n)·Σ_{i=1}^{n} xi²)) − exp((1/n)·Σ_{i=1}^{n} cos(2πxi)) + 20 + e; 30; [−32, 32]; 0; Multimodal
F11 Griewank: f11(x) = (1/4000)·Σ_{i=1}^{n} xi² − Π_{i=1}^{n} cos(xi/√i) + 1; 30; [−600, 600]; 0; Multimodal
F12 Generalized Penalized: f12(x) = (π/n){10sin²(πy1) + Σ_{i=1}^{n−1} (yi − 1)²[1 + 10sin²(πy_{i+1})] + (yn − 1)²} + Σ_{i=1}^{n} u(xi, 10, 100, 4), where yi = 1 + (xi + 1)/4 and u(xi, a, k, m) = k(xi − a)^m if xi > a; 0 if −a < xi < a; k(−xi − a)^m if xi < −a; 30; [−50, 50]; 0; Multimodal
F13 Generalized Penalized: f13(x) = 0.1{sin²(3πx1) + Σ_{i=1}^{n} (xi − 1)²[1 + sin²(3πxi + 1)] + (xn − 1)²[1 + sin²(2πxn)]} + Σ_{i=1}^{n} u(xi, 5, 100, 4); 30; [−50, 50]; 0; Multimodal
F14 Shekel's foxholes: f14(x) = [1/500 + Σ_{j=1}^{25} 1/(j + Σ_{i=1}^{2} (xi − aij)⁶)]⁻¹; 2; [−65, 65]; 1; Fixed-dimension
F15 Kowalik's: f15(x) = Σ_{i=1}^{11} [ai − x1(bi² + bi·x2)/(bi² + bi·x3 + x4)]²; 4; [−5, 5]; 0.00030; Fixed-dimension
F16 Six-Hump Camel-Back: f16(x) = 4x1² − 2.1x1⁴ + (1/3)x1⁶ + x1x2 − 4x2² + 4x2⁴; 2; [−5, 5]; −1.0316; Fixed-dimension
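Several of the Table 2 definitions translate directly into code. The sketch below is my own transcription of three of them (Sphere F1, Rastrigin F9, Ackley F10) and checks their global minimum at the origin:

```python
import math

def sphere(x):      # F1
    return sum(v * v for v in x)

def rastrigin(x):   # F9
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

def ackley(x):      # F10
    n = len(x)
    root_mean_square = math.sqrt(sum(v * v for v in x) / n)
    mean_cos = sum(math.cos(2.0 * math.pi * v) for v in x) / n
    return (-20.0 * math.exp(-0.2 * root_mean_square)
            - math.exp(mean_cos) + 20.0 + math.e)

origin = [0.0] * 30                  # dim = 30, as in Table 2
assert sphere(origin) == 0.0
assert rastrigin(origin) == 0.0
assert abs(ackley(origin)) < 1e-12   # zero up to floating-point round-off
```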
Table 2  (continued)

F17 Branin: f17(x) = (x2 − (5.1/(4π²))x1² + (5/π)x1 − 6)² + 10(1 − 1/(8π))cos x1 + 10; Dim 2; Range [−5, 5]; fmin 0.398; Fixed-dimension
F18 Goldstein price: f18(x) = [1 + (x1 + x2 + 1)²(19 − 14x1 + 3x1² − 14x2 + 6x1x2 + 3x2²)] × [30 + (2x1 − 3x2)²(18 − 32x1 + 12x1² + 48x2 − 36x1x2 + 27x2²)]; 2; [−2, 2]; 3; Fixed-dimension
F19 Hartman's family: f19(x) = −Σ_{i=1}^{4} ci·exp(−Σ_{j=1}^{3} aij(xj − pij)²); 3; [1, 3]; −3.86; Fixed-dimension
F20 Hartman's family: f20(x) = −Σ_{i=1}^{4} ci·exp(−Σ_{j=1}^{6} aij(xj − pij)²); 6; [0, 1]; −3.32; Fixed-dimension

and 5, the results of the multimodal and fixed-dimension multimodal functions are presented. The multimodal functions have multiple local optima and one global optimum; besides, the number of local optima can increase with the dimension of the problem. These functions can be a very important roadmap for evaluating the exploration and exploitation phases. Table 4 presents the results of each algorithm over the F8 to F13 functions. The SCSO algorithm gets the best solution in F8, F10, and F11. The WOA found the better solution in F9, while the BWO algorithm achieved better results in the F12 and F13 functions. According to the results of the multimodal test functions, it has been determined that the SCSO algorithm has a better performance compared to the others. Besides, Table 5 reports the simulation results for the functions F14 to F20. The fixed-dimension multimodal functions evaluate the exploration ability of the optimization algorithm. The SCSO algorithm found the best solution for the functions F15 to F19; in this category, SCSO is the best optimizer compared to the others, which indicates the SCSO algorithm's efficient performance in the exploration phase. In general, considering the performance of all algorithms over the 20 benchmark test functions, the SCSO algorithm achieved the best results and performed the best in 70% of the functions. The second-best algorithm was BWO with a 30% rate (it found the best answer in six of the 20 functions). The working mechanism of the SCSO algorithm is presented in Fig. 8 as an example.

Alongside these 20 test functions, the 100-digit challenge, also known as the "CEC-C06 benchmark test functions", is examined in this study [52]. Information about these functions is presented in Table 6, and the results are compared with the other 7 algorithms. The population number, the number of iterations, and the number of independent runs are assumed to be 30, 500, and 30, respectively. The results are presented in Table 7. When the results are analyzed, it is observed that SCSO performs better than the other algorithms: among these 10 functions, the SCSO found the better solution in CEC01, CEC02, CEC03, CEC08, and CEC09. It can be noticed from the results that the SCSO algorithm is competitive with the modern ones and provides satisfying results compared to the others. In the rest of the CEC-C06 function results, the GWO was better in CEC03, PSO in CEC04, GSA in CEC06, BWO in CEC10, and finally, SSA in CEC05 and CEC07. As seen, based on the results of the CEC-C06 2019 benchmark functions, it is understood that the SCSO algorithm can be used to solve these competition optimization problems better than the others.

3.2 Local optima and convergence curve

In this section, the performance of the SCSO algorithm is analyzed with respect to local optimum avoidance and the convergence curve. The SCSO algorithm starts


Fig. 7  Typical 2D representations of some benchmark functions [32]

Table 3  Simulation result for each algorithm for F1–F7


F SCSO CSO GWO WOA
MEAN STD MEAN STD MEAN STD MEAN STD

1 2.42E−97 7.22E−97 4.93E−11 9.26E−11 1.08E−27 1.82E−27 3.26E−71 1.78E−70


2 1.16E−52 2.55E−52 1.17E−06 7.85E−07 8.94E−17 7.23E−17 9.39E−52 3.56E−51
3 7.84E−81 3.49E−80 7.86E−03 1.62E−02 1.49E−09 3.52E−09 4.04E+04 1.46E+04
4 4.57E−46 9.98E−46 1.21E−01 1.01E−01 1.03E−07 1.24E−06 4.60E+01 2.90E+01
5 2.80E+01 8.73E−01 2.73E+01 5.96E−01 2.69E+01 6.23E−01 2.79E+01 3.84E−01
6 2.15E+00 4.47E−01 6.15E−01 3.73E−01 6.75E−01 4.31E−01 4.88E−01 3.17E−01
7 1.51E−04 1.33E−04 6.89E−02 3.41E−02 1.82E−03 7.27E−04 4.34E−03 4.14E−03
F SSA GSA PSO BWO
MEAN STD MEAN STD MEAN STD MEAN STD

1 3.08E−07 7.18E−07 2.50E−16 1.00E−16 7.04E−03 6.76E−03 1.55E−05 5.63E−05


2 1.71E+00 1.39E+00 1.85E−02 9.34E−02 2.69E−02 5.82E−02 1.06E−03 4.01E−03
3 1.47E+03 8.31E+02 9.41E+02 3.69E+02 2.07E+02 2.06E+02 1.27E+03 5.43E+02
4 1.11E+01 3.93E+00 6.61E+00 2.09E+00 7.49E+00 1.49E+00 1.78E+01 7.45E+00
5 2.47E+02 5.22E+02 6.52E+01 8.18E+01 9.42E+01 7.64E+01 8.57E+03 1.50E+04
6 1.75E−07 3.53E−07 2.85E−16 2.38E−16 3.74E−02 1.60E−01 1.60E−04 7.48E−04
7 1.59E−01 5.53E−02 8.57E−02 3.58E−02 5.70E−02 1.56E−02 5.33E−03 5.49E−03
The best values of algorithms are written in bold


from a large sensitivity range to discover more possible solutions, exploring the whole search area. As the iterations progress, the value of the sensitivity range decreases and the search agents try to exploit and find the global optimum. Local optima avoidance in the SCSO algorithm is controlled by the R parameter; it is realized with a balance between the exploration and exploitation phases. Figure 9 shows the convergence curve analysis of the SCSO algorithm in contrast to the other algorithms on some benchmark functions. The convergence behavior analysis focuses on the exploration and exploitation behavior of each algorithm. The SCSO algorithm has a good performance in the exploration phase due to the mechanism used in the location updating of the search agents. In addition, thanks to the attacking mechanism of the SCSO algorithm, the search agents are able to exploit efficiently.

Berg et al. [57] have mentioned that abrupt changes in the movement of the search agents in the initial steps of an optimization algorithm are required. These movement changes give rise to exploring the search space broadly; in the last steps, this movement should be decreased to emphasize exploitation. Generally, the abrupt change in the movement of the search agents causes them to discover possible solutions and exploit them. It was observed that the SCSO algorithm has a good convergence behavior: according to the mechanism of the SCSO algorithm, search agents discover areas in the early iterations and then try to exploit them after a certain number of iterations. Herewith, the performance of the SCSO algorithm on the benchmark functions is better than that of the other used metaheuristic algorithms.

In addition to the analyses made, Table 8 presents the p-values calculated by the nonparametric Wilcoxon rank-sum test for the pair-wise comparison over two independent samples (SCSO vs. CSO, GWO, WOA, SSA, GSA, PSO, BWO). The p-values are generated by the Wilcoxon test at a 0.05 significance level over 30 independent runs. In this table, the plus notation (+) indicates the superiority of the proposed algorithm, ~ indicates that both algorithms obtain equal values, and the minus notation (−) indicates that the obtained solution of the proposed algorithm is worse than that of the compared algorithm.

4 SCSO in classical engineering problems

In this section, some engineering problems, which have some equality and inequality limitations, are solved by the SCSO and the results are compared with other algorithms. These problems are seven well-known constrained engineering design problems: welded beam design (WBD), compression spring design (CSD), pressure vessel design (PVD), the piston lever problem (PL), speed reducer design (SRD), three-bar truss design (BTD), and cantilever beam design (CBD). Metaheuristic algorithms can usually be successful in solving such problems. The SCSO algorithm is applied to each problem and the results are compared with the other used metaheuristic algorithms (CSO, GWO, WOA, SSA, GSA, PSO, and BWO). The simulation assumptions are the same as in the previous section.

Table 4  Simulation result for each algorithm for F8–F13


F SCSO CSO GWO WOA
MEAN STD MEAN STD MEAN STD MEAN STD

8 − 1.01E+04 1.70E+03 − 6.17E+03 6.94E+02 − 5.96E+03 8.81E+02 − 6.98E+03 7.03E+02


9 0.00E+00 0.00E+00 1.28E+02 2.82E+01 2.71E+00 3.84E+00 0.00E+00 0.00E+00
10 8.77E−16 0.00E+00 8.69E−02 4.76E−01 1.04E−13 1.67E−14 4.09E−15 2.35E−15
11 0.00E+00 0.00E+00 9.92E−03 2.41E−02 5.45E−03 8.69E−03 1.49E−02 4.65E−02
12 1.25E−01 5.41E−02 1.47E+00 1.86E+00 5.99E−02 3.54E−02 2.89E−02 2.75E−02
13 1.99E+00 2.51E−01 3.10E+00 9.95E−01 6.64E−01 2.39E−01 5.66E−01 2.72E−01
F SSA GSA PSO BWO
MEAN STD MEAN STD MEAN STD MEAN STD

8 − 7.39E+03 7.32E+02 − 2.55E+03 4.43E+02 − 8.51E+03 7.38E+02 − 1.06E+04 1.20E+03


9 5.28E+01 2.08E+01 2.72E+01 8.01E+00 5.97E+01 1.76E+01 2.12E−04 7.06E−04
10 2.46E+00 6.88E−01 3.10E−02 1.70E−01 7.82E−01 7.06E−01 1.46E−03 4.19E−03
11 1.49E−02 1.22E−02 2.90E+01 7.26E+00 2.90E−02 2.93E−02 1.40E−01 2.15E−01
12 7.34E+00 2.59E+00 1.59E+00 7.24E−01 1.43E−01 2.41E−01 3.65E−03 1.99E−02
13 1.68E+01 1.74E+01 9.06E−01 6.80E−01 2.53E−02 3.87E−02 9.32E−03 4.19E−02

The best values of algorithms are written in bold


Table 5  Simulation result for each algorithm for F14–F20


F SCSO CSO GWO WOA
MEAN STD MEAN STD MEAN STD MEAN STD

14 1.73E+00 9.00E−01 9.98E−01 2.18E−09 3.19E+00 3.33E+00 3.48E+00 3.49E+00


15 4.82E−04 3.22E−04 6.17E−04 5.19E−04 3.16E−03 6.87E−03 8.07E−04 6.06E−04
16 − 1.0316 1.15E−09 − 1.0316 4.76E−07 − 1.0316 2.40E−08 − 1.0316 1.99E−09
17 0.398 1.55E−07 0.398 2.42E−07 0.398 3.61E−06 0.398 2.96E−05
18 3 1.67E−05 3 9.98E−05 3 8.14E−05 3 1.59E−04
19 − 3.86 1.49E−03 − 3.86 3.55E−04 − 3.86 2.58E−03 − 3.86 4.04E−03
20 − 3.25 9.95E−02 − 3.32 2.18E−02 − 3.25 8.07E−02 − 3.23 1.41E−01
F SSA GSA PSO BWO
MEAN STD MEAN STD MEAN STD MEAN STD

14 1.23E+00 5.01E−01 5.61E+00 3.76E+00 9.54E+00 1.09E+00 5.69E+00 3.92E+00


15 2.94E−03 5.92E−03 4.26E−03 2.43E−03 2.02E−03 5.00E−03 1.79E−03 1.38E−03
16 − 1.0316 4.45E−14 − 1.0316 4.79E−16 − 1.0316 6.71E−16 − 1.0316 8.68E−03
17 0.398 0.00E+00 0.398 0 0.398 0.00E+00 0.398 0.0000
18 3 3.48E−13 3 3.54E−15 3 1.78E−15 3 6.40E+00
19 − 3.86 2.47E−11 − 3.86 2.37E−15 − 3.86 2.61E−15 − 3.86 1.17E−02
20 − 3.22 5.62E−02 − 3.32 1.54E−15 − 3.29 5.11E−02 − 3.21 1.17E−01

The best values of algorithms are written in bold

4.1 Welded beam design problem

In this problem, the objective is to design a welded beam with minimum production cost [32, 58]. There are some optimization constraints, such as the shear stress (τ), the buckling load on the bar (Pc), the end deflection of the beam (δ), and the bending stress in the beam (θ). The four variables of this problem are the thickness of the weld (h), the length of the attached part of the bar (l), the height of the bar (t), and the thickness of the bar (b) (see Fig. 10). This problem is formulated mathematically as follows.

Consider: x⃗ = [x1 x2 x3 x4] = [h l t b],
Minimize: f(x⃗) = 1.10471x1²x2 + 0.04811x3x4(14.0 + x2),
subject to:
g1(x⃗) = τ(x⃗) − τmax ≤ 0,
g2(x⃗) = σ(x⃗) − σmax ≤ 0,
g3(x⃗) = δ(x⃗) − δmax ≤ 0,
g4(x⃗) = x1 − x4 ≤ 0,
g5(x⃗) = P − Pc(x⃗) ≤ 0,
g6(x⃗) = 0.125 − x1 ≤ 0,
g7(x⃗) = 1.10471x1²x2 + 0.04811x3x4(14.0 + x2) − 5.0 ≤ 0
Variable range: 0.1 ≤ x1 ≤ 2.00, 0.1 ≤ x2 ≤ 10.0, 0.1 ≤ x3 ≤ 10.0, 0.1 ≤ x4 ≤ 2.00

The obtained results, which are presented in Table 9, show that the SCSO algorithm finds a minimum-cost design in comparison to the other metaheuristic algorithms.

4.2 Tension/compression spring design problem

The second engineering optimization problem we investigated is the tension/compression spring design problem. As shown in Fig. 11, the main goal of this problem is to minimize the weight of a tension/compression spring [15, 58]. The variables of this problem are the wire diameter (d), the mean coil diameter (D), and the number of active coils (N). Surge frequency, minimum deflection, and shear stress are the three constraints of the problem. Definitions related to this problem are given mathematically below. Moreover, Table 10 presents the obtained results for the SCSO and all compared algorithms. Based on the results obtained, the performance of the SCSO algorithm is better than that of the other used metaheuristic algorithms at finding the optimum cost.

Consider: x⃗ = [x1 x2 x3] = [d D N],


Fig. 8  The working mechanism of the SCSO algorithm in defined iterations (sample case: Ackley function ­F10)

Table 6  Modern 10 benchmark test functions from CEC2019 (CEC–C06)

Function  Benchmark function  Dim  Range  fmin
CEC01 Storn's Chebyshev polynomial fitting problem 9 [− 8192, 8192] 1
CEC02 Inverse Hilbert matrix problem 16 [− 16384, 16384] 1
CEC03 Lennard–Jones minimum energy cluster 8 [− 4, 4] 1
CEC04 Rastrigin's function 10 [− 100, 100] 1
CEC05 Griewank's function 10 [− 100, 100] 1
CEC06 Weierstrass function 10 [− 100, 100] 1
CEC07 Modified Schwefel’s function 10 [− 100,100] 1
CEC08 Expanded Schaffer’s F6 function 10 [− 100,100] 1
CEC09 Happy cat function 10 [− 100,100] 1
CEC10 Ackley function 10 [− 100,100] 1


Table 7  Simulation result for each algorithm for CEC–C06


F SCSO CSO GWO WOA
MEAN STD MEAN STD MEAN STD MEAN STD

CEC01 1.04E+05 7.10E−10 1.58E+09 1.71E+09 1.53E+06 7.10E−10 1.53E+07 2.44E+07


CEC02 1.70E+01 3.61E−15 1.97E+01 5.81E−01 1.98E+01 3.61E−15 1.74E+01 5.50E−02
CEC03 1.27E+01 0.00E+00 1.37E+01 2.35E−06 1.27E+01 0.00E+00 1.27E+01 6.78E−07
CEC04 7.40E+02 7.83E+01 1.79E+02 5.53E+01 7.41E+02 2.96E+00 4.19E+02 2.39E+02
CEC05 8.79E+00 2.60E−01 5.67E+00 1.72E−01 8.79E+00 7.23E−04 1.74E+00 2.53E−01
CEC06 2.18E+00 1.05E+00 1.12E+01 7.08E−01 3.77E+00 1.08E−14 8.97E+00 1.29E+00
CEC07 4.54E+02 4.37E+02 3.65E+02 1.64E+02 5.61E+02 9.21E+00 5.52E+02 2.77E+02
CEC08 5.04E+00 3.40E−01 5.50E+00 4.85E−01 5.05E+00 1.87E−01 5.77E+00 5.82E−01
CEC09 2.36E+00 2.39E−02 6.32E+00 1.29E+00 6.45E+00 2.77E−02 4.91E+00 6.62E−01
CEC10 2.02E+01 1.49E−01 2.13E+00 6.08E−02 2.14E+01 7.23E−15 2.02E+01 1.09E−01
F SSA GSA PSO BWO
MEAN STD MEAN STD MEAN STD MEAN STD

CEC01 1.07E+05 1.20E+05 4.58E+12 2.93E+12 1.47E+09 1.44E+09 3.09E+09 4.61E+09


CEC02 1.73E+01 9.70E−03 2.01E+02 6.60E+01 1.73E+01 3.72E+01 1.87E+01 3.44E+00
CEC03 1.27E+01 7.06E−04 1.27E+01 3.61E−15 1.27E+01 3.61E−15 1.27E+01 1.11E−06
CEC04 4.53E+01 2.34E+01 3.41E+02 1.86E+02 1.71E+01 8.01E+00 1.30E+03 6.15E+02
CEC05 1.23E+00 1.31E−01 1.26E+00 1.47E−01 1.33E+00 3.02E−02 6.97E+00 5.77E−02
CEC06 5.64E+00 1.86E+00 1.01E+00 4.89E−05 9.31E+00 1.05E+00 5.64E+00 1.07E+00
CEC07 2.91E+02 2.87E+02 4.58E+02 2.54E+00 3.21E+02 3.02E+01 5.98E+02 4.02E+00
CEC08 5.23E+00 7.73E−01 4.19E+00 9.01E−01 5.23E+00 7.61E−01 5.71E+00 1.62E−02
CEC09 2.65E+00 1.67E−01 2.47E+00 2.36E−02 2.41E+00 4.12E−02 8.91E+01 1.18E+02
CEC10 2.05E+01 1.01E−01 1.98E+01 1.37E−02 2.07E+01 1.09E−01 1.96E+01 1.11E−01

The best values of algorithms are written in bold

Minimize: f(x⃗) = (x3 + 2)x2x1²,
subject to:
g1(x⃗) = 1 − x2³x3/(71785x1⁴) ≤ 0,
g2(x⃗) = (4x2² − x1x2)/(12566(x2x1³ − x1⁴)) + 1/(5108x1²) − 1 ≤ 0,
g3(x⃗) = 1 − 140.45x1/(x2²x3) ≤ 0,
g4(x⃗) = (x1 + x2)/1.5 − 1 ≤ 0
Variable range: 0.05 ≤ x1 ≤ 2.00, 0.25 ≤ x2 ≤ 1.30, 2.00 ≤ x3 ≤ 15.0

4.3 Pressure vessel design problem

In this study, the third optimization problem discussed is the pressure vessel design problem. The objective of this problem is to minimize the total cost of a cylinder-shaped pressure vessel, including material, welding, and forming [58, 59]. The head of the vessel has a hemispherical shape, and both ends of the vessel are capped. The problem variables are the thickness of the shell (Ts), the thickness of the head (Th), the inner radius (R), and the length of the cylindrical section without considering the head (L), as shown in Fig. 12. In addition, this problem has four constraints, represented by the mathematical expressions below. According to the simulation results, the proposed algorithm outperformed the other used algorithms in finding the optimum cost; the results are presented in Table 11.

Consider: x⃗ = [x1 x2 x3 x4] = [Ts Th R L],
Minimize: f(x⃗) = 0.6224x1x3x4 + 1.7781x2x3² + 3.1661x1²x4 + 19.84x1²x3


subject to:
g1(x⃗) = −x1 + 0.0193x3 ≤ 0,
g2(x⃗) = −x2 + 0.00954x3 ≤ 0,
g3(x⃗) = −πx3²x4 − (4/3)πx3³ + 1296000 ≤ 0,
g4(x⃗) = x4 − 240 ≤ 0,
Variable range: 0 ≤ x1 ≤ 99, 0 ≤ x2 ≤ 99, 10 ≤ x3 ≤ 200, 10 ≤ x4 ≤ 200

4.4 Piston lever problem

The main objective is to locate the piston components H (x1), B (x2), D (x3), and X (x4) by minimizing the oil volume when the lever of the piston is lifted up from 0° to 45°, as shown in Fig. 13 [60]. The equation of this problem is given as follows:

Minimize: f(H, B, D, X) = (1/4)πD²(L2 − L1)
Subject to: G1 = QL·cos θ − RF ≤ 0; G2 = Q(L − X) − Mmax ≤ 0; G3 = 1.2(L2 − L1) − L1 ≤ 0; G4 = D/2 − B ≤ 0;
where
R = |−X(X sin θ + H) + H(B − X cos θ)| / √((X − B)² + H²), F = πPD²/4,
L1 = √((X − B)² + H²), L2 = √((X sin θ + H)² + (B − X cos θ)²),
θ = 45°; Q = 10,000 lbs; M = 1.8 × 10⁶ lbs; P = 1500 psi; L = 240;
Range: 0.05 ≤ x1, x2, x4 ≤ 500; 0.05 ≤ x3 ≤ 120.

These inequality constraints consider the force equilibrium, the maximum bending moment of the lever, the minimum piston stroke, and geometrical conditions. The best solutions obtained by the SCSO and the other candidate algorithms are presented in Table 12.

4.5 Speed reducer design problem

This problem (Fig. 14) is a famous design problem in mechanical systems. The speed reducer is one of the essential parts of a gearbox, and it can be employed for several applications. In this optimization problem, the weight of the speed reducer must be minimized subject to 11 constraints [61]; seven of them are nonlinear constraints and the rest are linear inequality constraints. The constraints involve the bending stress of the gear teeth, the surface stress, the transverse deflections of the shafts, and the stresses in the shafts. In addition, this problem has seven variables: face width b(x1), module of teeth m(x2), number of teeth in the pinion z(x3), length of the first shaft between bearings l1(x4), length of the second shaft between bearings l2(x5), diameter of the first shaft d1(x6), and diameter of the second shaft d2(x7). The equation of this problem is given as follows:

Minimize: f(b, m, z, l1, l2, d1, d2) = 0.7854x1x2²(3.3333x3² + 14.9334x3 − 43.0934) − 1.508x1(x6² + x7²) + 7.4777(x6³ + x7³) + 0.7854(x4x6² + x5x7²)

Subject to:
G1 = 27/(x1x2²x3) − 1 ≤ 0; G2 = 397.5/(x1x2²x3²) − 1 ≤ 0;
G3 = 1.93x4³/(x2x6⁴x3) − 1 ≤ 0; G4 = 1.93x5³/(x2x7⁴x3) − 1 ≤ 0;
G5 = √((745x4/(x2x3))² + 16.9 × 10⁶)/(110x6³) − 1 ≤ 0;
G6 = √((745x5/(x2x3))² + 157.5 × 10⁶)/(85x7³) − 1 ≤ 0;
G7 = x2x3/40 − 1 ≤ 0; G8 = 5x2/x1 − 1 ≤ 0; G9 = x1/(12x2) − 1 ≤ 0;
G10 = (1.5x6 + 1.9)/x4 − 1 ≤ 0; G11 = (1.1x7 + 1.9)/x5 − 1 ≤ 0;

Range: 2.6 ≤ x1 ≤ 3.6; 0.7 ≤ x2 ≤ 0.8; 17 ≤ x3 ≤ 28; 7.3 ≤ x4, x5 ≤ 8.3; 2.9 ≤ x6 ≤ 3.9; 5 ≤ x7 ≤ 5.5;

Fig. 9  The convergence curve analysis of each algorithm in some test functions

According to the simulation results, the proposed algorithm outperformed the other used algorithms in finding the optimum cost. The results are presented in Table 13.

4.6 Three-bar truss design problem

This design considers a three-bar planar truss structure, as shown in Fig. 15. The goal is to minimize the relevant weight. This case includes two optimization variables (A1 (= x1), A2 (= x2)) with three optimization constraints: stress, deflection, and buckling. This example is reported in [62] as a


Table 8  Pair-wise comparison with SCSO and all compared algorithms


Func SAND vs CSO SAND vs GWO SAND vs WOA SAND vs SSA SAND vs GSA SAND vs PSO SAND vs BWO
P value h P value h P value h P value h P value h P value h P value h

F1 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 +


F2 1.73E−06 + 1.73E−06 + 8.94E−01 − 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 +
F3 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 +
F4 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 +
F5 1.73E−06 + 5.31E−05 − 1.85E−01 + 3.52E−06 + 6.64E−04 + 1.73E−06 + 1.73E−06 +
F6 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 − 1.73E−06 + 1.73E−06 +
F7 1.73E−06 + 1.73E−06 + 3.18E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 +
F8 9.32E−06 + 1.17E−02 + 1.73E−06 − 5.29E−04 + 1.73E−06 + 2.60E−06 + 1.73E−06 +
F9 1.73E−06 + 2.06E−02 + 1 ~ 1.73E−06 + 3.09E−01 + 4.49E−02 + 2.07E−02 +
F10 1.73E−06 + 1.63E−06 + 3.53E−05 + 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 +
F11 1.73E−06 + 3.35E−03 + 3.17E−01 + 1.73E−06 + 1.73E−06 + 1.73E−06 + 1.73E−06 +
F12 1.73E−06 + 2.13E−06 + 1.02E−05 + 1.73E−06 + 1.73E−06 + 1.36E−01 + 1.73E−06 −
F13 1.73E−06 + 1.73E−06 + 1.73E−06 + 4.29E−06 + 2.60E−05 + 1.73E−06 + 1.73E−06 −
F14 1.48E−02 − 1.25E−02 + 5.71E−02 + 4.86E−05 + 1.71E−03 + 1.73E−06 + 3.72E−05 +
F15 1.48E−02 + 8.77E−01 + 1.25E−02 + 1.64E−05 + 1.73E−06 + 9.27E−03 + 1.74E−04 +
F16 1 ~ 1 ~ 1 ~ 1 ~ 1 ~ 1 ~ 1 ~
F17 1 ~ 1 ~ 1 ~ 1 ~ 1 ~ 1 ~ 1 ~
F18 1 ~ 1 ~ 1 ~ 1 ~ 1 ~ 1 ~ 1 ~
F19 1 ~ 1 ~ 1 ~ 1 ~ 1 ~ 1 ~ 1 ~
F20 1.48E−02 − 1 ~ 1.73E−06 + 1.73E−06 + 1.73E−06 − 1.73E−06 − 5.04E−01 +
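The p-values and +/~/− marks in Table 8 come from rank-sum tests at the 0.05 level. Below is a self-contained sketch of one plausible decision rule, using the normal approximation to the Wilcoxon rank-sum statistic as a stand-in for `scipy.stats.ranksums`; it ignores tie correction, and reading '+' as "SCSO's values are significantly lower" is my interpretation of the table, not the authors' exact procedure:

```python
import math

def ranksum_p(a, b):
    """Two-sided p-value, normal approximation to the Wilcoxon rank-sum test
    (assumes no ties between the samples)."""
    n1, n2 = len(a), len(b)
    pooled = sorted(list(a) + list(b))
    rank_sum = sum(pooled.index(v) + 1 for v in a)   # 1-based ranks of sample a
    mu = n1 * (n1 + n2 + 1) / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (rank_sum - mu) / sigma
    return math.erfc(abs(z) / math.sqrt(2.0))        # 2 * (1 - Phi(|z|))

def decide(scso_runs, other_runs, alpha=0.05):
    """'+' if SCSO is significantly better (lower, for minimization),
    '-' if significantly worse, '~' otherwise."""
    if ranksum_p(scso_runs, other_runs) >= alpha:
        return '~'
    better = sum(scso_runs) / len(scso_runs) < sum(other_runs) / len(other_runs)
    return '+' if better else '-'
```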

Table 9  Comparison results for welded beam design problem for all
algorithms
Algorithms Optimum variables Optimum cost
h l t b

SCSO 0.2057 3.2530 9.0366 0.2057 1.6952


CSO 0.2044 3.3125 8.9941 0.2108 1.7321
GWO 0.2029 3.3172 9.0404 0.2058 1.7100
WOA 0.2919 3.5048 4.9972 0.6728 3.1610
SSA 0.1734 3.9339 9.0565 0.2056 1.7374
GSA 0.1545 4.6516 9.5467 0.2553 2.3093
PSO 0.2059 3.2508 9.0332 0.2059 1.6963
BWO 0.5584 2.5582 5.5655 0.6067 3.5709

The best algorithm is written in bold

Fig. 10  Design parameters of the welded beam design problem
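As a quick consistency check, the WBD objective of Sect. 4.1 can be evaluated at the SCSO design from Table 9; with the variables rounded to four decimals, the reported cost of 1.6952 is reproduced to about three decimal places:

```python
def wbd_cost(h, l, t, b):
    """Welded beam cost: f(x) = 1.10471*h^2*l + 0.04811*t*b*(14.0 + l)."""
    return 1.10471 * h * h * l + 0.04811 * t * b * (14.0 + l)

cost = wbd_cost(0.2057, 3.2530, 9.0366, 0.2057)  # SCSO row of Table 9
assert abs(cost - 1.6952) < 2e-3                 # matches up to input rounding
```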


highly constrained search space. The equation of this problem is given as follows:

Minimize: f(A1, A2) = (2√2·x1 + x2) × l
Subject to:
G1 = ((√2·x1 + x2)/(√2·x1² + 2x1x2))P − σ ≤ 0;
G2 = (x2/(√2·x1² + 2x1x2))P − σ ≤ 0;
G3 = (1/(√2·x2 + x1))P − σ ≤ 0;


Fig. 11  Design parameters of the tension/compression spring design problem

Table 10  Comparison results for tension/compression spring design problem for all algorithms

Algorithms  d  D  N  Optimum cost
SCSO 0.0500 0.3175 14.0200 1.2717020E−02
CSO 0.0671 0.8482 2.4074 1.6829585E−02
GWO 0.0500 0.3174 14.0373 1.2727747E−02
WOA 0.0554 0.4526 7.2886 1.2901922E−02
SSA 0.0500 0.3122 14.7463 1.3069754E−02
GSA 0.0606 0.2749 4.8674 1.7762975E−02
PSO 0.0500 0.3175 14.0373 1.2717021E−02
BWO 0.0500 0.3122 14.7963 1.3109512E−02

The best algorithm is written in bold

Table 11  Comparison results for pressure vessel design problem for all algorithms

Algorithms  Ts  Th  R  L  Optimum cost
SCSO 0.7798 0.9390 40.3864 199.2918 5917.46
CSO 0.9953 0.4922 51.5106 87.1774 6389.35
GWO 0.8055 0.3992 41.7309 181.2937 5938.51
WOA 0.8093 1.2151 41.1110 189.2692 8497.55
SSA 0.9101 0.4499 47.1576 122.6549 6152.18
GSA 1.0921 14.1349 56.5865 71.8650 84,851.85
PSO 1.0206 0.5045 52.8803 77.0186 6442.20
BWO 4.0618 20.3225 58.7254 77.2508 159,345.5

The best algorithm is written in bold
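The same check works for the CSD problem: evaluating the spring weight and the first (deflection) constraint of Sect. 4.2 at the SCSO solution from Table 10 reproduces the reported cost and shows g1 sitting on the constraint boundary, up to the rounding of the printed variables:

```python
def spring_weight(d, D, N):
    """CSD objective: f(x) = (N + 2) * D * d^2."""
    return (N + 2.0) * D * d * d

def g1(d, D, N):
    """Deflection constraint: g1(x) = 1 - D^3*N / (71785*d^4) <= 0."""
    return 1.0 - (D ** 3) * N / (71785.0 * d ** 4)

d, D, N = 0.0500, 0.3175, 14.0200                 # SCSO row of Table 10
assert abs(spring_weight(d, D, N) - 1.2717020e-2) < 5e-6
assert abs(g1(d, D, N)) < 5e-3                    # active (boundary) constraint
```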

where l = 100 cm; P = 2 kN/cm²; σ = 2 kN/cm²;
Range: 0 ≤ x1, x2 ≤ 1.

The obtained results, which are presented in Table 14, demonstrate that the SCSO algorithm finds a minimum-cost design value in comparison to the other metaheuristic algorithms.

Fig. 12  Design parameters of the pressure vessel design problem


the performance of the SCSO algorithm is better than other


used metaheuristic algorithms at finding the optimum cost.
( )
Minimize: f (X) = 0.0624 x1 + x2 + x3 + x4 + x5

61 37 19 7 1
Subject to: G(X) = 3
+ 3 + 3 + 3 + 3 − 1 ≤ 0.
x1 x1 x3 x4 x5

Range: 0.01 ≤ xi ≤ 100; i ∈ 1, … , 5.

5 Conclusions and future works

In this paper, a new population-based metaheuristic algo-


rithm was proposed. This algorithm, which was called Sand
Cat Swarm Optimization (SCSO), was inspired by the nature
of sand cats. It has been assumed with the concept of the
herd and is a swarm intelligence optimization algorithm. In the first place, the sand cats search the whole area; then they try to exploit the best area they have found so far. The algorithm performed well in finding good solutions with few parameters and operations. The balance between the exploration and exploitation phases is ensured with the R parameter, while the success of the operations within both phases is governed by the r parameter. In the proposed algorithm, the relevant search agents are directed to the optima using the angle as well as the r parameter during the exploitation phase. In this study, the low-frequency hearing ability of sand cats inspired the control parameter, denoted rG in the mathematical model. Its value decreases linearly from two to zero as the iterations progress, so that the algorithm approaches the prey it is tracking without losing or passing it (not moving away). This parameter underpins the flexibility and versatility of the method. Adaptively tuned, the rG vector simultaneously yields local optima avoidance and a faster, more accurate convergence curve. Therefore, the algorithm can complete all of its tasks within the given number of trials and iterations. In the working mechanism of the proposed algorithm, cats are directed towards the cat with the best fitness value found in each iteration, while new areas for exploitation are sought globally in the exploration phase. In summary, the SCSO escapes local optimum traps and is easy to apply. The SCSO is advantageous in this respect, and randomness is realized by simple and cost-effective mathematical rules. In addition, the convergence behavior of the proposed algorithm was analyzed. Briefly, the defined random and adaptive parameters support the exploitation and convergence rate in a balanced way as the iteration counter increases.

Fig. 13  The structure of the piston lever problem

Table 12  Comparison results for the piston lever problem for all algorithms

Algorithms  X1      X2      X3     X4      Optimum cost
SCSO        0.050   2.040   4.083  119.99  8.40901438899551
CSO         0.050   2.399   4.804  85.68   13.7094866557362
GWO         0.060   2.0390  4.083  120     8.40908765909047
WOA         0.099   2.057   4.112  118.14  9.05943208079399
SSA         0.050   2.073   4.145  116.32  8.80243253777633
GSA         497.49  500     2.215  60.041  168.094363238712
PSO         500     500     1.882  59.999  159.301622646318
BWO         12.364  12.801  3.074  172.02  95.9980864948937

The best algorithm is shown in bold.

4.7 Cantilever beam design problem

The last engineering problem is the cantilever beam design problem [63]. It is a structural engineering design example related to the weight optimization of a cantilever beam with a square cross-section [42, 63]. The beam is rigidly supported at node 1, and a given vertical force acts at node 6 (Fig. 16). The problem has five variables, the heights (or widths) of the different beam elements, while the thickness is held fixed (here t = 2/3). Based on the results obtained (Table 15), SCSO found the lowest cost for this problem.
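The parameter mechanism summarized in the conclusion above (rG decaying linearly from two to zero, R selecting the phase, r scaling the moves, and a random angle in the exploitation step) can be sketched in a few lines. This is a simplified illustration, not the authors' reference implementation: the exact update equations, the roulette-wheel selection of the angle, and the boundary handling used in the paper are abstracted, so treat the specific formulas below as assumptions.

```python
import math
import random

def scso_step(pos, best, iter_c, iter_max, lb, ub):
    """One simplified SCSO position update for a single sand cat."""
    rG = 2.0 - 2.0 * iter_c / iter_max   # general sensitivity: 2 -> 0 over the run
    R = 2.0 * rG * random.random() - rG  # phase selector, drawn from [-rG, rG]
    r = rG * random.random()             # sensitivity of this cat's moves
    new_pos = []
    for d in range(len(pos)):
        if abs(R) > 1.0:
            # exploration: probe a new region biased by the best cat found so far
            cand = r * (best[d] - random.random() * pos[d])
        else:
            # exploitation: approach the best cat along a random angle
            theta = random.uniform(0.0, 2.0 * math.pi)
            dist = abs(random.random() * best[d] - pos[d])
            cand = best[d] - r * dist * math.cos(theta)
        new_pos.append(min(max(cand, lb), ub))  # clamp to the search bounds
    return new_pos
```

With this schedule, |R| can exceed 1 only while rG > 1 (roughly the first half of the run), so the population shifts from exploration to exploitation as iterations progress; at the final iteration rG = 0, so the cats settle on the best position found, matching the "approach the prey without passing it" behavior described above.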

Engineering with Computers
Table 13  Comparison results for the speed reducer design problem for all algorithms

Algorithms  x1    x2    x3     x4    x5    x6    x7    Optimum cost
SCSO        3.47  0.70  17     7.88  7.80  3.34  5.27  2991.09771580958
CSO         3.48  0.70  17     8.26  7.95  3.34  5.28  2997.70741533625
GWO         3.48  0.70  17     8.29  7.84  3.35  5.28  2995.49707279355
WOA         3.48  0.70  17     7.30  7.80  3.34  5.28  2997.98729873411
SSA         3.48  0.70  17     7.59  8.17  3.34  5.28  2996.41880525515
GSA         3.48  0.70  17     8.15  8.12  3.34  5.28  3000.36632738096
PSO         3.48  0.70  17     7.30  8.30  3.34  5.28  2996.72111429286
BWO         3.58  0.72  18.28  7.73  7.73  3.43  5.28  3417.15347894919

The best algorithm is shown in bold.

Fig. 14  The structure of the speed reducer

The proposed algorithm was simulated on 30 benchmark functions and applied to seven engineering optimization problems, and in all of them the results were compared with those of seven well-known metaheuristic algorithms (CSO, GWO, WOA, SSA, GSA, PSO, and BWO). According to the results, SCSO performed the most successfully and achieved the best overall results among all eight algorithms. The SCSO algorithm outperformed the others in 19 of the 30 test functions and ranked first. The BWO algorithm, which ranked second on these functions, found the best solution in only seven functions in total. This demonstrates the superior success rate of the SCSO algorithm. Moreover, the SCSO algorithm achieved the best results in all seven engineering problems considered.

Table 14  Comparison results for the three-bar truss design problem for all algorithms

Algorithms  X1                 X2                 Optimum cost
SCSO        0.786098941917387  0.406914764228565  263.463430505945
CSO         0.765698110495407  0.467907820151335  263.793582109035
GWO         0.786421822208503  0.405927167232469  263.463551492204
WOA         0.799266262606330  0.370890516767604  263.585682732223
SSA         0.788416031222806  0.408111465276111  263.852346429101
GSA         0.747070495056356  0.530675746732991  264.769804538555
PSO         0.788415193837924  0.408113863439769  263.852346428578
BWO         0.783079543980098  0.423422973994359  263.873679704273

The best algorithm is shown in bold.


Due to the balanced behavior of this algorithm between the two phases, and its effort to find other possible local areas in the global field, it is also expected to perform well on high-dimensional and multi-objective problems. Besides, it can be used for composite and other engineering problems; therefore, it is planned to apply it to such problems in future studies. In addition, it can be hybridized with other stochastic components, such as local or global search methods, over the search space of the optimization problem to improve its performance. Furthermore, it can be used for solving many complex problems such as feature selection, image segmentation, complex electrical circuits, scheduling and dispatching problems, 3D path planning in mobile robotics or connected vehicle networks, and optimized node localization in such systems. Moreover, it can be applied to various NP-hard problems in wireless sensor networks, the Internet of Things, logistics, smart farming, bioinformatics, and machine learning-based applications. Another future work area is tuning, which can be considered in two different ways: one can use the proposed algorithm to tune the hyperparameters of most machine learning methods, or develop a novel mechanism to tune the parameters of the proposed algorithm itself.

Fig. 15  The structure of the three-bar truss

Fig. 16  The structure of the cantilever beam

Table 15  Comparison results for the cantilever beam design problem for all algorithms

Algorithms  x1      x2      x3      x4      x5      Optimum cost
SCSO        6.0164  5.3060  4.4935  3.5059  2.1516  1.33995240092402
CSO         6.7628  5.1583  5.6537  2.9279  1.8854  1.39702393412684
GWO         6.0103  5.3557  4.4827  3.5022  2.1248  1.34005895149258
WOA         5.1261  5.6188  5.0952  3.9329  2.3219  1.37873150673956
SSA         6.0143  5.3049  4.4987  3.5039  2.1517  1.33995265490689
GSA         5.6052  4.9553  5.6619  3.1959  3.2026  1.41155753917296
PSO         6.0040  5.2950  4.4915  3.5125  2.1710  1.33998298081255
BWO         6.2094  6.2094  6.2094  6.2094  6.2094  1.93736251728534

The best algorithm is shown in bold.
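The reported designs can be sanity-checked against the standard cantilever beam formulation used in the benchmark literature [42, 63]: minimize the weight 0.0624(x1 + x2 + x3 + x4 + x5) subject to 61/x1^3 + 37/x2^3 + 19/x3^3 + 7/x4^3 + 1/x5^3 <= 1. These coefficients are taken from the cited benchmark sources rather than quoted from this paper's own equations, so treat them as an assumption:

```python
def cantilever_weight(x):
    """Objective: beam weight, 0.0624 times the sum of the five element heights."""
    return 0.0624 * sum(x)

def cantilever_constraint(x):
    """Vertical-deflection constraint; the design is feasible when the value is <= 0."""
    coeffs = [61, 37, 19, 7, 1]
    return sum(c / xi**3 for c, xi in zip(coeffs, x)) - 1.0

# SCSO's reported design variables from Table 15
x_scso = [6.0164, 5.3060, 4.4935, 3.5059, 2.1516]
w = cantilever_weight(x_scso)      # ~1.3399, matching Table 15 to four decimals
g = cantilever_constraint(x_scso)  # ~0, i.e. the deflection constraint is active
```

That the constraint evaluates to essentially zero at the reported optimum is expected: the best designs sit on the feasibility boundary of this problem.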


Supplementary Information The online version contains supplementary material available at https://doi.org/10.1007/s00366-022-01604-x.

Author contributions AS: conceptualization, investigation, methodology, software, validation, formal analysis, original draft, writing - review and editing. FK: conceptualization, supervision, project administration, methodology, writing - review and editing.

Declarations

Conflict of interest The authors declare no conflict of interest.

References

1. Jamil M, Yang XS (2013) A literature survey of benchmark functions for global optimization problems. http://arxiv.org/abs/1308.4008
2. Talbi EG (2009) Metaheuristics: from design to implementation, vol 74. Wiley, New York, pp 5–39
3. Tang C, Zhou Y, Tang Z et al (2021) Teaching-learning-based pathfinder algorithm for function and engineering optimization problems. Appl Intell 51:5040–5066
4. Wolpert DH, Macready WG (1997) No free lunch theorems for optimization. IEEE Trans Evol Comput 1(1):67–82
5. Kiani F, Seyyedabbasi A, Nematzadeh S (2021) Improving the performance of hierarchical wireless sensor networks using the metaheuristic algorithms: efficient cluster head selection. Sens Rev 1–14
6. Kaveh A (2017) Applications of metaheuristic optimization algorithms in civil engineering. Springer International Publishing, Basel. https://doi.org/10.1007/978-3-319-48012-1
7. Kiani F, Seyyedabbasi A, Mahouti P (2021) Optimal characterization of a microwave transistor using grey wolf algorithms. Analog Integr Circ Sig Process 109:599–609
8. Can U, Alatas B (2015) Physics based metaheuristic algorithms for global optimization. Am J Inf Sci Comput Eng 1(3):94–106
9. Back T (1996) Evolutionary algorithms in theory and practice: evolution strategies, evolutionary programming, genetic algorithms. Oxford University Press, Oxford
10. Holland JH (1992) Genetic algorithms. Sci Am 267(1):66–73
11. Storn R, Price K (1997) Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces. J Glob Optim 11(4):341–359
12. Cai X, Zhao H, Shang Sh, Zhou Y et al (2021) An improved quantum-inspired cooperative co-evolution algorithm with multi-strategy and its application. Expert Syst Appl 121:1–13
13. Glover F (1990) Tabu search: a tutorial. Inf J Appl Anal 20(4):75–94
14. Yao X, Liu Y, Lin G (1999) Evolutionary programming made faster. IEEE Trans Evol Comput 3(2):82–102
15. Simon D (2008) Biogeography-based optimization. IEEE Trans Evol Comput 12(6):702–713
16. Hayyolalam V, Kazem AAP (2020) Black widow optimization algorithm: a novel metaheuristic approach for solving engineering optimization problems. Eng Appl Artif Intell 87(103249):1–28
17. Webster B, Bernhard PJ (2003) A local search optimization algorithm based on natural principles of gravitation. Florida Institute of Technology, Technical Reports, pp 1–19
18. Rashedi E, Nezamabadi-Pour H, Saryazdi S (2009) GSA: a gravitational search algorithm. Inf Sci 179(13):2232–2248
19. Erol OK, Eksin I (2006) A new optimization method: big bang–big crunch. Adv Eng Softw 37(2):106–111
20. Hatamlou A (2013) Black hole: a new heuristic optimization approach for data clustering. Inf Sci 222:175–184
21. Kaveh A, Talatahari S (2010) A novel heuristic optimization method: charged system search. Acta Mech 213(3):267–289
22. Moghaddam FF, Moghaddam RF, Cheriet M (2012) Curved space optimization: a random search based on general relativity theory. http://arxiv.org/abs/1208.2214
23. Formato RA (2007) Central force optimization: a new metaheuristic with applications in applied electromagnetics. Prog Electromag Res 77:425–491
24. Shah-Hosseini H (2011) Principal components analysis by the galaxy-based search algorithm: a novel metaheuristic for continuous optimisation. Int J Comput Sci Eng 6(1–2):132–140
25. Kennedy J, Eberhart R (1995) Particle swarm optimization. In: Proceedings of ICNN'95 - international conference on neural networks, vol 4. IEEE, pp 1942–1948
26. Dorigo M, Birattari M, Stutzle T (2006) Ant colony optimization. IEEE Comput Intell Mag 1(4):28–39
27. Okdem S, Karaboga D (2009) Routing in wireless sensor networks using an ant colony optimization (ACO) router chip. Sensors 9(2):909–921
28. Seyyedabbasi A, Kiani F (2020) MAP-ACO: an efficient protocol for multi-agent pathfinding in real-time WSN and decentralized IoT systems. Microprocess Microsyst 79(103325):1–9
29. Karaboga D, Basturk B (2007) Artificial bee colony (ABC) optimization algorithm for solving constrained optimization problems. International fuzzy systems association world congress. Springer, Berlin, pp 789–798
30. Yang XS (2010) A new metaheuristic bat-inspired algorithm. Nature inspired cooperative strategies for optimization (NICSO 2010). Springer, Berlin, pp 65–74
31. Yang XS (2009) Firefly algorithms for multimodal optimization. International symposium on stochastic algorithms. Springer, Berlin, pp 169–178
32. Mirjalili S, Mirjalili SM, Lewis A (2014) Grey wolf optimizer. Adv Eng Softw 69:46–61
33. Seyyedabbasi A, Kiani F (2021) I-GWO and Ex-GWO: improved algorithms of the grey wolf optimizer to solve global optimization problems. Eng Comput 37:509–532
34. Mirjalili S, Lewis A (2016) The whale optimization algorithm. Adv Eng Softw 95:51–67
35. Mirjalili S (2016) Dragonfly algorithm: a new metaheuristic optimization technique for solving single objective, discrete, and multi-objective problems. Neural Comput Appl 27(4):1053–1073
36. Yang XS, Deb S (2009) Cuckoo search via Lévy flights. In: 2009 world congress on nature & biologically inspired computing (NaBIC). IEEE, pp 210–214
37. Arora S, Singh S (2019) Butterfly optimization algorithm: a novel approach for global optimization. Soft Comput 23(3):715–734
38. Bayraktar Z, Komurcu M, Werner DH (2010) Wind driven optimization (WDO): a novel nature-inspired optimization algorithm and its application to electromagnetics. In: IEEE Antennas and Propagation Society International Symposium (APSURSI), pp 1–4
39. Chu SC, Tsai PW, Pan JS (2006) Cat swarm optimization. In: Pacific Rim international conference on artificial intelligence. Springer, Berlin, pp 854–858
40. Pan WT (2012) A new fruit fly optimization algorithm: taking the financial distress model as an example. Knowl Based Syst 26:69–74
41. Yang XS (2012) Flower pollination algorithm for global optimization. In: Unconventional computation and natural computation, lecture notes in computer science, vol 7445, pp 240–249
42. Mirjalili S, Gandomi AH, Mirjalili SZ, Saremi S, Faris H, Mirjalili SM (2017) Salp swarm algorithm: a bio-inspired optimizer for engineering design problems. Adv Eng Softw 114:163–191
43. Tang C, Zhou Y, Luo Q et al (2021) An enhanced pathfinder algorithm for engineering optimization problems. Eng Comput. https://doi.org/10.1007/s00366-021-01286-x
44. Heidari AA, Mirjalili S, Faris H, Aljarah I, Mafarja M, Chen H (2019) Harris hawks optimization: algorithm and applications. Future Gener Comput Syst 97:849–872
45. Zhong L, Zhou Y, Luo Q, Zhong K (2021) Wind driven dragonfly algorithm for global optimization. Concurr Comput Pract Exp 33(6):e6054
46. Wang Z, Luo Q, Zhou Y (2021) Hybrid metaheuristic algorithm using butterfly and flower pollination base on mutualism mechanism for global optimization problems. Eng Comput 37:3665–3698
47. Cole FR, Wilson DE (2015) Felis margarita (Carnivora: Felidae). Mamm Species 47(924):63–77
48. Huang G, Rosowski J, Ravicz M, Peake W (2002) Mammalian ear specializations in arid habitats: structural and functional evidence from sand cat (Felis margarita). J Comp Physiol A 188(9):663–681
49. Abbadi M (1989) Radiotelemetric observations on sand cats (Felis margarita) in the Arava Valley. Isr J Zool 36:155–156
50. Liang JJ, Qu BY, Suganthan PN (2013) Problem definitions and evaluation criteria for the CEC 2014 special session and competition on single objective real-parameter numerical optimization, vol 635. Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou, China and Technical Report, Nanyang Technological University, Singapore, pp 1–32
51. Liang JJ, Qu BY, Suganthan PN, Chen Q (2014) Problem definitions and evaluation criteria for the CEC 2015 competition on learning-based real-parameter single objective optimization, vol 29. Technical Report 201411A. Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou, China and Technical Report, Nanyang Technological University, Singapore, pp 625–640
52. Price KV, Awad NH, Ali MZ, Suganthan PN (2018) The 100-digit challenge: problem definitions and evaluation criteria for the 100-digit challenge special session and competition on single objective numerical optimization. Nanyang Technological University
53. Yao X, Liu Y, Lin G (1999) Evolutionary programming made faster. IEEE Trans Evol Comput 3:82–102
54. Seyyedabbasi A, Aliyev R, Kiani F, Gulle M, Basyildiz H, Shah M (2021) Hybrid algorithms based on combining reinforcement learning and metaheuristic methods to solve global optimization problems. Knowl Based Syst 223:1–22
55. Molga M, Smutnicki C (2005) Test functions for optimization needs
56. Jamil M, Yang X (2013) A literature survey of benchmark functions for global optimization problems. Int J Math Model Numer Optim 4(2):1–47
57. Van den Bergh F, Engelbrecht AP (2006) A study of particle swarm optimization particle trajectories. Inf Sci 176(8):937–971
58. Coello CAC (2000) Use of a self-adaptive penalty approach for engineering optimization problems. Comput Ind 41(2):113–127
59. Chattopadhyay S (2004) Pressure vessels: design and practice, 1st edn. CRC Press, Boca Raton. https://doi.org/10.1201/9780203492468
60. Gandomi AH, Yang XS, Alavi AH (2013) Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems. Eng Comput 29:17–35
61. Bayzidi H, Talatahari S, Saraee M, Lamarche CP (2021) Social network search for solving engineering optimization problems. Comput Intell Neurosci
62. Nowacki H (1974) Optimization in pre-contract ship design. In: Fujita Y, Lind K, Williams TJ (eds) Computer applications in the automation of shipyard operation and ship design, vol 2. North Holland, Elsevier, New York, pp 327–338
63. Chickermane H, Gea HC (1996) Structural optimization using a new local approximation method. Int J Numer Methods Eng 39(5):829–846

Publisher's Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
