Sand Cat Optimization

Abstract
This study proposes a new metaheuristic algorithm called sand cat swarm optimization (SCSO) which mimics the sand cat
behavior that tries to survive in nature. These cats are able to detect low frequencies below 2 kHz and also have an incred-
ible ability to dig for prey. The proposed algorithm, inspired by these two features, consists of two main phases (search
and attack). This algorithm controls the transitions in the exploration and exploitation phases in a balanced manner and
performed well in finding good solutions with fewer parameters and operations. It is carried out by finding the direction and
speed of the appropriate movements with the defined adaptive strategy. The SCSO algorithm is tested with 20 well-known
along with modern 10 complex test functions of CEC2019 benchmark functions and the obtained results are also compared
with famous metaheuristic algorithms. According to the results, the algorithm that found the best solution in 63.3% of the
test functions is SCSO. Moreover, the SCSO algorithm is applied to seven challenging engineering design problems such as
welded beam design, tension/compression spring design, pressure vessel design, piston lever, speed reducer design, three-bar
truss design, and cantilever beam design. The obtained results show that the SCSO performs well in terms of convergence rate and in locating all or most of the local/global optima, and outperforms the other compared methods.
problem. As such, there is a considerable demand to develop new metaheuristic algorithms that can be used in various problems. Therefore, due to their wide area of use and advantages, metaheuristic optimization algorithms have become popular in many fields of science in recent years [5–7]. In general, metaheuristic algorithms are inspired by biological or physical phenomena [8] and try to find an optimum solution (maximum or minimum) in a reasonable time.

The metaheuristic algorithms are generally divided into two classes: single-solution and population-based (multiple-solution) algorithms. In single-solution algorithms a single solution influences the output, whereas in population-based algorithms the entire population is involved. In addition, metaheuristic algorithms are classified into evolutionary, physics-based, and swarm-intelligence algorithms. The evolutionary algorithms (EAs) are inspired by natural evolution and Darwin's theory of evolution [9]. EAs solve a problem within a random search space; an EA is a population-based method in which the whole population affects the optimum solution. The genetic algorithm (GA) is a well-known algorithm in the EA category [10], which is inspired by generational reproduction. The GA generates new generations by mimicking crossover, mutation, and elitism to find global optima. The differential evolution (DE) algorithm is another algorithm that is motivated by natural evolution. The DE algorithm differs from the GA in the selection operation used to generate the next generation [11]. The improved quantum-inspired cooperative coevolution algorithm (MSQCCEA) [12] is one of the current studies. This algorithm improves the global search capability and does not fall into the local optimum trap. Accordingly, a quantum spin direction strategy has been developed to change the quantum evolution direction from one to two. It was applied to knapsack and actual airport gate allocation problems, and the results show the good performance of the introduced algorithm in terms of fast and accurate convergence. The Tabu search (TS) algorithm is a different example of evolutionary algorithms [13]. Evolutionary programming (EP) is inspired by behavioral models such as phenotype, heredity, and variation [14]. The biogeography-based optimizer (BBO) [15] and black widow optimization (BWO) [16] are other examples of EAs.

The physics-based algorithms are another family of metaheuristic methods. These algorithms are inspired by the physics rules in nature and act randomly according to the assumed physical events. In this kind of algorithm, the search space and optimal solution follow physical rules such as electromagnetic force, gravitational force, and inertial force. There are famous optimization algorithms in this category such as the gravitational local search algorithm (GLSA) [17], the gravitational search algorithm (GSA) [18], and the Big Bang–Big Crunch (BBBC) [19]. The Black Hole (BH) algorithm, which is inspired by the black hole phenomenon [20], is another algorithm in the physics-based category. Other famous works in this category are charged system search (CSS) [21], curved space optimization [22], central force optimization (CFO) [23], and the galaxy-based search algorithm (GbSA) [24]. In the methods in this category, search agents act randomly according to the assumed physical events.

Swarm intelligence (SI) is the third category of metaheuristic algorithms. Generally, the SI methods are inspired by the social behaviors of animals that live in swarms, herds, and flocks in nature. In this kind of algorithm, the search agents try to find an optimal solution by leveraging social intelligence. Particle Swarm Optimization (PSO) is the most famous algorithm in this category [25]. The PSO has been inspired by the movement and social behaviors of birds in nature. A particle indicates a candidate solution, and the algorithm tries to find the best solution from the search space based on these particles. Ant colony optimization (ACO) is another algorithm in this category [26]. The ACO algorithm simulates the foraging behaviors of ants. The main goal of an ant in nature is to find the destination (food) and generate an optimal (secure and minimum-cost) path between source and destination. In the literature, this algorithm has been used to solve many types of problems [27, 28]. The artificial bee colony (ABC) [29] algorithm is another important algorithm that mimics the social behavior of honeybees. In the ABC, during discovery, a path is generated between destination (nest) and source (food). Searching for a rich food resource and the bees' experience are important phases in the ABC algorithm for finding the optimal solution. Other algorithms in this category are the bat algorithm (BA) [30], firefly algorithm (FA) [31], grey wolf optimization (GWO) [32] and its different variants [33], whale optimization algorithm (WOA) [34], dragonfly algorithm (DA) [35], cuckoo search (CS) [36], butterfly optimization algorithm (BOA) [37], wind-driven optimization (WDO) [38], cat swarm optimization (CSO) [39], fruit fly optimization algorithm (FFOA) [40], flower pollination algorithm (FPA) [41], Salp swarm algorithm (SSA) [42], Pathfinder algorithm (PFA) [43] and its improved version [3], and the Harris hawks optimization (HHO) algorithm [44]. In addition, hybrid metaheuristic algorithms have become widespread in recent years. Two of the valuable studies done in this context are presented in [45, 46]. In [45], the authors present a novel method based on the DA and WDO algorithms. The authors corrected the shortcomings of these two algorithms and, as a result, showed that their proposed method has fast convergence speed and strong global search and is accurate in finding solutions. In [46], the authors were inspired by the BOA and FPA algorithms and proposed a flexible algorithm to solve global optimization problems. The authors suggested a new mechanism for efficient switching between exploration and exploitation phases, and thereby the algorithm exhibits balanced behavior.
The main phases of metaheuristic algorithms are exploration and exploitation, and each algorithm has specific strategies to realize the concepts of these phases. The more balanced these two phases are in a proposed algorithm, the higher its success rate can be. In these strategies, the randomness and the appropriate coefficients that are defined are important. It is very important to set the relevant coefficient parameters properly to ensure a good balance between these two phases [33]. Exploration means searching globally. This phase needs more search, so the relevant algorithm randomly selects a solution(s) from the search space. The exploitation phase comes after the exploration phase, and it focuses on the solution(s) in the search space in order to improve them. In other words, unvisited areas are checked to see whether they are potential escape routes from poor local optima. The randomness of the search in exploration helps the algorithm avoid being trapped in local optima and explore the global optimum.

In this paper, a new metaheuristic optimization algorithm has been proposed which may be more suitable for solving various problems because it behaves in a balanced way between the related phases, in line with the NFL theorem. Also, the proposed algorithm has a more favorable complexity than other current metaheuristic algorithms. The proposed algorithm is inspired by the search and hunting behavior of the sand cat and is named sand cat swarm optimization (SCSO). The sand cats live alone in nature but are considered here as a herd, each member of which can be identified by the user as a search agent in the proposed algorithm. In the development of the proposed algorithm, the focus is on the low-frequency noise detection behavior that sand cats use to find prey. These cats detect low frequencies below 2 kHz and thus they may catch prey at long distances in a short time and with little effort. Moreover, sand cats have an incredible ability to dig for prey. Thanks to these two wonderful features, sand cats have a special ability in searching and hunting in nature. The proposed algorithm also behaves in a balanced way in the exploration and exploitation phases with an adaptive mechanism. The key contributions of this paper are summarized below.

1. A new population-based optimization algorithm called SCSO mimics the searching and hunting behavior of the sand cats.
2. Various tests have been carried out on 20 well-known benchmark functions and 10 modern benchmark functions (CEC2019) for performance evaluation of the proposed algorithm, as well as seven case studies as engineering optimization problems. Its results are compared with various other well-known metaheuristic algorithms to evaluate its performance. Experimental results show that the SCSO algorithm is more successful in solving complex optimization problems.
3. The proposed SCSO algorithm is able to escape from the local optima trap and has a suitable and balanced behavior between the exploitation and exploration phases in comparison with other investigated algorithms.
4. The SCSO has fewer parameters and operators compared to other metaheuristic algorithms.
5. The SCSO is easier to implement than the other metaheuristic algorithms.

The rest of this paper is organized as follows. Section 2 explains the sand cat behaviors in depth and the mathematical model of the SCSO. Section 3 presents the simulation results and discussions of the benchmark test functions. Section 4 explains the results on seven engineering optimization problems. The conclusion and future works take place in the last section.

2 Sand cat swarm optimization (SCSO)

This section first describes the sand cats, which were the inspiration for the proposed algorithm. Then, the mathematical model and working mechanism of the proposed method are presented.

2.1 Inspiration: sand cats in nature

The sand cat (Felis margarita) is a member of the Felis genus of mammals. The sand cat lives in harsh environments in sandy and stony deserts such as the deserts of Central Asia, the African Sahara, and the Arabian Peninsula. The small, nimble, and humble sand cat has a distinctive way of hunting and living. The living behavior of a sand cat is unlike that of the domestic cat, although there is no big difference in appearance between a sand cat and a domestic cat. Sand cats do not live in a group like many felines. The sand cat's palms and the soles of its feet are covered with a high density of sandy to light grey fur. The fur coverage of the soles of the feet insulates the foot pads against the desert's harsh hot and cold weather. Furthermore, the fur characteristics of the sand cat make detection and tracking indistinct and difficult. The sand cat's body length is about 45–57 cm. Its tail is about half of the head and body length (28–35 cm), and it has short legs; the front claws are short and sharply curved, while the back claws are more elongated and weakly curved. The adult sand cat's weight ranges from 1 to 3.5 kg. The ears on the sides of the head are 5–7 cm long. The ears of the sand cat have a considerable role in foraging. The nocturnal, subterranean, and secretive nature of this cat has made it special [47, 48].
Finding food is hard for any animal in such a harsh and punishing environment. The sand cats also have difficulties in the deserts, but they benefit from the cool night to find food. During the day they rest underground, and they usually try to hunt at night. These cats spend most of their time in their burrows, but in a resting posture they lie on their backs to release internal heat. Besides, to overcome thirst they derive water from their food. They usually eat about 10% of their body weight, but they can also eat more than normal. The male sand cats walk more than the females at night, averaging 5.5 km and 3.2 km, respectively, for male and female cats. In the winter season, the average walking distance is lower than in the summer season [49]. Like the rest of the Felis family, the sand cat uses its paws for hunting. They are fine hunters and eat small desert rodents, reptiles, small birds, spiny mice, and insects, and they are proficient snake hunters.

The hunting mechanism in sand cats is quite interesting. They use their fantastic sense of hearing to pick up low-frequency noises. In this way, the sand cats detect the prey (insects and rodents) moving underground. There is no difference between sand and domestic cats in the ear's pinna flange (outer ear). In the case of the middle ear, the ear canal in sand cats is longer than in domestic cats, which causes a large volume of air space in the middle ear. Furthermore, the sand cat can detect the difference in time-of-arrival between different sounds. The acoustic input-admittance of the ear in cats is related to the tympanic membrane, and it is 5 times larger in sand cats than in domestic cats. Also, the middle ear cavities and bone chains contribute to increasing the acoustic input-admittance. Scientific studies show that the hearing of the sand cat at frequencies below 2 kHz is incredible: in this frequency range, sand cats are about 8 dB more sensitive than domestic cats [47, 48]. These unique characteristics can be the reason that the sand cat detects noise (prey movement), tracks prey, and attacks successfully based on the prey location (Fig. 1). The sand cat also has a curious ability to dig rapidly if the prey is underground. According to the behaviors of the sand cat, there are two stages in foraging: searching for and attacking the prey. In the algorithm proposed in this paper (SCSO), these two stages are emphasized. In addition, a mechanism is suggested for the realization of the exploration and exploitation phases and to achieve the balance between them.

2.2 Mathematical model and optimization algorithm

The Sand Cat Swarm Optimization (SCSO) algorithm has been inspired by the sand cat behaviors in nature. The two main actions of the sand cat are foraging for prey and attacking the prey. The proposed algorithm is inspired by the special feature of the sand cat, which is the ability to detect low-frequency noises. Sand cats can locate prey with this extraordinary feature whether it is on the ground or underground. Thanks to this important feature, they can find and catch their prey quickly.
Fig. 2 Working mechanism of SCSO in the initial and definition phase: the population is an n × d matrix in which row i holds Sand Cat_i = [x_1, x_2, …, x_d], i ∈ population(1, n), together with its cost, and Fitness = f(Sand Cat) = f(x_1, x_2, …, x_d) is calculated for each of the n sand cats.

Since the sand cat lives alone in nature, in the proposed algorithm the authors assume the sand cats form a herd, to emphasize the concept of swarm intelligence. So, in the algorithm initialization, the number of sand cats can be declared to optimize a minimization or maximization problem. For this, the first step is creating the initial population and defining the problem.

2.2.1 Initial population

In the solution of an optimization problem, the values of the relevant variables should be defined in accordance with the solution of the current problem. For example, in PSO it is called a particle position [25] and in GWO it is called a grey wolf position [32]. In our SCSO algorithm, it is called a sand cat, and each cat shows the values of the problem variables. In the proposed algorithm, which is a population-based method, the related structure is defined as a vector. In a d-dimensional optimization problem, a sand cat is a 1 × d array representing the solution to the problem and is defined as shown in Fig. 2. Each of the variable values (x_1, x_2, …, x_d) is a floating-point number. Here every x must be located between the lower and upper boundaries (∀x_i ∈ [lower, upper]). To start the SCSO algorithm, a candidate matrix is first created with the sand cat population according to the size of the problem (N_pop × N_d), (pop = 1, …, n).

In addition, the fitness cost of each sand cat is obtained by evaluating the defined fitness function. This function defines the relevant parameters of the problem, and the best values of the parameters (variables) will be obtained by the SCSO. A value of the corresponding function is output for each sand cat. When an iteration is finished, the sand cat with the best cost so far in that iteration is chosen as the best solution (if there was no answer as good as this in the previous iterations), and the other sand cats try to move towards this best-chosen cat in the next iteration, because the best solution in each iteration can represent the cat closest to the prey.

2.2.2 Searching the prey (exploration)

In this section, the searching mechanism of the SCSO algorithm is described. The prey search mechanism of sand cats relies on low-frequency noise emission. The solution for each sand cat is represented as X_i = (x_{i1}, x_{i2}, x_{i3}, …, x_{id}). The SCSO algorithm benefits from the sand cats' hearing ability in detecting low frequencies. In this way, the sensitivity range for each cat is declared. As mentioned before, the sand cat senses low frequencies below 2 kHz. In the mathematical model, this value (r_G) will linearly decrease from 2 to 0 as the iterations progress, according to the working mechanism of the proposed algorithm, so that the cat approaches the hunted prey and does not lose or pass it (does not move away). Thus, to search for prey, it is assumed that the sand cat sensitivity range starts from 2 kHz and goes down to 0 (Eq. 1). Since the s_M value is inspired by the hearing characteristics of the sand cats, its value is assumed to be 2. However, when solving different problems, this value, which determines the speed of action of the agents, can be customized accordingly. This shows the flexibility and versatility of the equation presented. For example, if the maximum number of iterations is 100, the value of r_G will be greater than 1 in the first 50 iterations and less than 1 in the last 50 iterations. It is worth mentioning that the final and main parameter in controlling the transition between the exploration and exploitation phases is R. The R is a vector obtained from Eq. 2. Thanks to this adaptive strategy, the transitions and possibilities in the two phases will be more balanced.

The search space is randomly initialized between the defined boundaries. In the searching step, the position update of each current search agent is based on a random position. In this way, the search agents are able to explore new spaces in the search space. The sensitivity range for each sand cat is different in order to avoid the local optimum trap, and this is realized by Eq. 3. Therefore, r_G indicates the general sensitivity range that is decreased linearly from 2 to 0, while r denotes the sensitivity range of each cat. The r is used for the operations in the exploration or exploitation phases, while r_G guides the R parameter that controls the transitions between these phases. In addition, iter_c is the current iteration and iter_Max is the maximum number of iterations.

$$\vec{r_G} = s_M - \left( \frac{2 \times s_M \times iter_c}{iter_{Max} + iter_{Max}} \right), \tag{1}$$

$$\vec{R} = 2 \times \vec{r_G} \times rand(0, 1) - \vec{r_G}, \tag{2}$$

$$\vec{r} = \vec{r_G} \times rand(0, 1). \tag{3}$$
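As a concrete illustration of the initialization step and of Eqs. 1–2, the following Python sketch builds the candidate matrix of Fig. 2 and computes the adaptive parameters. The function names and the sphere objective used here are illustrative assumptions of this sketch, not the authors' reference implementation (which is released in MATLAB at the link given in Sect. 2.2.4).

```python
import numpy as np

def init_population(n_pop, dim, lower, upper, objective):
    """Create the N_pop x N_d candidate matrix of Fig. 2 and evaluate each sand cat."""
    pop = lower + np.random.rand(n_pop, dim) * (upper - lower)  # each row: [x1, ..., xd]
    cost = np.apply_along_axis(objective, 1, pop)               # fitness of every sand cat
    return pop, cost

def sensitivity_parameters(iter_c, iter_max, s_M=2.0):
    """Eqs. (1)-(2): the general sensitivity range r_G decays linearly from s_M to 0,
    and R is drawn from [-2*r_G, 2*r_G] to control the exploration/exploitation switch."""
    r_G = s_M - (2.0 * s_M * iter_c) / (iter_max + iter_max)
    R = 2.0 * r_G * np.random.rand() - r_G
    return r_G, R

# Illustrative usage on a sphere objective (an assumption, for demonstration only)
sphere = lambda x: np.sum(x ** 2)
pop, cost = init_population(n_pop=30, dim=10, lower=-100.0, upper=100.0, objective=sphere)
best = pop[np.argmin(cost)]  # the best sand cat so far guides the others in the next iteration
```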
Each search agent (sand cat) updates its own position based on the best-candidate position (Pos_bc), its current position (Pos_c), and its sensitivity range (r). Therefore, the sand cats are able to find other possible best prey positions (Eq. 4). This equation gives the algorithm another chance to find new local optima in the search area. Hence, the obtained position is located between the current position and the position of the prey. Moreover, this is achieved by randomness, not by exact methods. In this way, the search agents in the algorithm benefit from randomness, which keeps the algorithm low in operation cost and efficient in complexity.

$$\overrightarrow{Pos}(t + 1) = \vec{r} \cdot \left( \overrightarrow{Pos_{bc}}(t) - rand(0, 1) \cdot \overrightarrow{Pos_{c}}(t) \right). \tag{4}$$

2.2.3 Attacking the prey (exploitation)

As mentioned before, the sand cats detect the prey based on the ability of their ears. To mathematically model the attacking phase of SCSO, the distance between the best position (Pos_b, the best solution) and the current position (Pos_c) of the sand cat is calculated by Eq. 5.
Furthermore, the sand cat's sensitivity range is assumed to be a circle, so the direction of movement is determined by a random angle (θ) on the circle. Of course, other parameters, declared in Eq. 5, also affect the specification of the direction of movement. Since the random angle is chosen between 0 and 360 degrees, the corresponding cosine value will be between −1 and 1. In this way, each member of the population is able to move in a different circular direction in the search space. The SCSO uses the Roulette Wheel selection algorithm to select a random angle for each sand cat. In this way, the sand cat can approach the hunting position. The random angle is also used to avoid the local optimum trap. Using the random angle in Eq. 5 has a positive effect on the approach of the agents to the prey and guides them. Pos_rnd indicates the random position and ensures that the cats involved can get close to the prey. The position updating in SCSO in two consecutive iterations is shown in Fig. 3. As the iterations progress, the cats approach the prey. In addition, a representative working example of SCSO from the beginning to the last iteration is shown in Fig. 4.

$$\overrightarrow{Pos_{rnd}} = \left| rand(0, 1) \cdot \overrightarrow{Pos_{b}}(t) - \overrightarrow{Pos_{c}}(t) \right|, \qquad \overrightarrow{Pos}(t + 1) = \overrightarrow{Pos_{b}}(t) - \vec{r} \cdot \overrightarrow{Pos_{rnd}} \cdot \cos(\theta). \tag{5}$$

2.2.4 Exploration and exploitation

Exploration and exploitation are guaranteed by the adaptive values of the r_G and R parameters. These parameters allow SCSO to switch seamlessly between the two phases. Since the R parameter depends on r_G, its fluctuation range is also decreased. As stated before, when the values of the r_G parameter are distributed in a balanced way, the R value will also be well balanced, and therefore the chances of operating in the two phases will be appropriate for the problem. In other words, R is a random value in the interval [−2r_G, 2r_G], where r_G is reduced linearly from 2 to 0 over the iterations. When the random values of R are in [−1, 1], the next position of a sand cat can be at any position between its current position and the prey position. The SCSO algorithm forces the search agents to exploit when |R| is lower than or equal to 1; otherwise, the search agents are forced to explore and find prey. In the searching-the-prey phase (exploration), the different radius of each cat avoids the local optimum trap. This feature is one of the effective parameters in attacking the prey (exploitation) too. Equation 6 shows the position update for each sand cat in the exploration and exploitation phases. As shown in Fig. 5, when |R| ≤ 1, sand cats are directed to attack their prey; otherwise, the cats are tasked with finding a new possible solution in the global area. The balanced behavior of the proposed algorithm and the effort to find other possible local areas in the global space yield a fast and accurate convergence rate, and therefore it is useful for performing well on high-dimensional and multi-objective problems. The pseudocode and flowchart of the proposed algorithm (SCSO) are given in Algorithm 1 and Fig. 6, respectively. The open-source code of the SCSO algorithm is available at https://www.mathworks.com/matlabcentral/fileexchange/87659-sand-cat-swarm-optimization-algorithm.

$$\vec{X}(t + 1) = \begin{cases} \overrightarrow{Pos_{b}}(t) - \overrightarrow{Pos_{rnd}} \cdot \cos(\theta) \cdot \vec{r}, & |R| \le 1 \ \text{(exploitation)} \\ \vec{r} \cdot \left( \overrightarrow{Pos_{bc}}(t) - rand(0, 1) \cdot \overrightarrow{Pos_{c}}(t) \right), & |R| > 1 \ \text{(exploration)} \end{cases} \tag{6}$$
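The update rule of Eqs. 3–6 can be condensed into a short sketch. The snippet below is a minimal interpretation of one position update for a single sand cat, assuming a uniformly drawn angle in place of the roulette-wheel selection of θ and using the global best position for both Pos_b and Pos_bc; it is not the authors' MATLAB implementation.

```python
import numpy as np

def update_position(pos_c, pos_b, r_G, rng=np.random):
    """One SCSO position update for a single sand cat following Eqs. (3)-(6).
    pos_c: current position, pos_b: best position found so far (also used in place
    of the best-candidate position Pos_bc, an assumption of this sketch)."""
    r = r_G * rng.rand()                      # Eq. (3): per-cat sensitivity range
    R = 2.0 * r_G * rng.rand() - r_G          # Eq. (2): transition-control parameter
    theta = rng.uniform(0.0, 2.0 * np.pi)     # random angle standing in for roulette-wheel selection

    if abs(R) <= 1:                           # exploitation: attack the prey (Eq. 5)
        pos_rnd = np.abs(rng.rand() * pos_b - pos_c)
        new_pos = pos_b - r * pos_rnd * np.cos(theta)
    else:                                     # exploration: search for prey (Eq. 4)
        new_pos = r * (pos_b - rng.rand() * pos_c)
    return new_pos
```

In a full run, this update would be applied to every sand cat at every iteration, with r_G recomputed from Eq. 1 and the best position refreshed after each sweep over the population.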
Fig. 4 Working example of SCSO over the iterations: (c) performance towards the last iterations; (d) performance in the last iteration
2.2.5 Computational complexity

The computational complexity of calculating the defined control parameters is O(n × m), where n represents the population size and m represents the size of the problem. The initialization phase also requires O(n × m) time. Besides, the computational complexity of the agents' position update is also O(n × m). Considering n and m equal, the general computational complexity of the proposed algorithm is O(n²); in other words, the computational complexity of the SCSO algorithm is O(n × m) = O(n²) if n = m. Indeed,

$$T(n) = (n + 1) + (m + 1) + c(n \times m) + O(1) \rightarrow O(1) + \sum_{i=1}^{n} \sum_{j=1}^{m} c \cong O(n^2).$$

… obtained and the analysis performed. These functions that are used in this paper are divided into three groups: unimodal, multimodal, and fixed-dimension multimodal. These functions are employed to examine the efficiency of the SCSO algorithm. The unimodal benchmark functions have just one global optimum (maximum or minimum) and no local optima. Multimodal functions have more than one …
Table 2 Benchmark functions (CEC14, 15)

F1 Sphere: $f_{1}(x)=\sum_{i=1}^{n} x_{i}^{2}$; Dim = 30; Range = [−100, 100]; f_min = 0; Unimodal
F2 Schwefel 2.22: $f_{2}(x)=\sum_{i=1}^{n}\left|x_{i}\right|+\prod_{i=1}^{n}\left|x_{i}\right|$; Dim = 30; Range = [−10, 10]; f_min = 0; Unimodal
F3 Schwefel 1.2: $f_{3}(x)=\sum_{i=1}^{n}\left(\sum_{j=1}^{i} x_{j}\right)^{2}$; Dim = 30; Range = [−100, 100]; f_min = 0; Unimodal
F4 Schwefel 2.21: $f_{4}(x)=\max_{i}\left\{\left|x_{i}\right|, 1 \leq i \leq n\right\}$; Dim = 30; Range = [−100, 100]; f_min = 0; Unimodal
F5 Generalized Rosenbrock: $f_{5}(x)=\sum_{i=1}^{n-1}\left[100\left(x_{i+1}-x_{i}^{2}\right)^{2}+\left(x_{i}-1\right)^{2}\right]$; Dim = 30; Range = [−30, 30]; f_min = 0; Unimodal
F6 Step: $f_{6}(x)=\sum_{i=1}^{n}\left(\left\lfloor x_{i}+0.5\right\rfloor\right)^{2}$; Dim = 30; Range = [−100, 100]; f_min = 0; Unimodal
F7 Quartic: $f_{7}(x)=\sum_{i=1}^{n} i\, x_{i}^{4}+random[0,1)$; Dim = 30; Range = [−1.28, 1.28]; f_min = 0; Unimodal
F8 Generalized Schwefel: $f_{8}(x)=\sum_{i=1}^{n}-x_{i} \sin\left(\sqrt{\left|x_{i}\right|}\right)$; Dim = 30; Range = [−500, 500]; f_min = −418.9829 × 5; Multimodal
F9 Rastrigin: $f_{9}(x)=\sum_{i=1}^{n}\left[x_{i}^{2}-10 \cos\left(2 \pi x_{i}\right)+10\right]$; Dim = 30; Range = [−5.12, 5.12]; f_min = 0; Multimodal
F10 Ackley: $f_{10}(x)=-20 \exp\left(-0.2 \sqrt{\tfrac{1}{n} \sum_{i=1}^{n} x_{i}^{2}}\right)-\exp\left(\tfrac{1}{n} \sum_{i=1}^{n} \cos\left(2 \pi x_{i}\right)\right)+20+e$; Dim = 30; Range = [−32, 32]; f_min = 0; Multimodal
F11 Griewank: $f_{11}(x)=\tfrac{1}{4000} \sum_{i=1}^{n} x_{i}^{2}-\prod_{i=1}^{n} \cos\left(\tfrac{x_{i}}{\sqrt{i}}\right)+1$; Dim = 30; Range = [−600, 600]; f_min = 0; Multimodal
F12 Generalized penalized 1: $f_{12}(x)=\tfrac{\pi}{n}\left\{10 \sin^{2}\left(\pi y_{1}\right)+\sum_{i=1}^{n-1}\left(y_{i}-1\right)^{2}\left[1+10 \sin^{2}\left(\pi y_{i+1}\right)\right]+\left(y_{n}-1\right)^{2}\right\}+\sum_{i=1}^{n} u\left(x_{i}, 10, 100, 4\right)$, where $y_{i}=1+\tfrac{x_{i}+1}{4}$ and $u\left(x_{i}, a, k, m\right)=\begin{cases}k\left(x_{i}-a\right)^{m}, & x_{i}>a \\ 0, & -a<x_{i}<a \\ k\left(-x_{i}-a\right)^{m}, & x_{i}<-a\end{cases}$; Dim = 30; Range = [−50, 50]; f_min = 0; Multimodal
F13 Generalized penalized 2: $f_{13}(x)=0.1\left\{\sin^{2}\left(3 \pi x_{1}\right)+\sum_{i=1}^{n}\left(x_{i}-1\right)^{2}\left[1+\sin^{2}\left(3 \pi x_{i}+1\right)\right]+\left(x_{n}-1\right)^{2}\left[1+\sin^{2}\left(2 \pi x_{n}\right)\right]\right\}+\sum_{i=1}^{n} u\left(x_{i}, 5, 100, 4\right)$; Dim = 30; Range = [−50, 50]; f_min = 0; Multimodal
The table also lists fixed-dimension multimodal functions, among them F18 (Goldstein price) and F19–F20 (Hartman's family).

… multimodal functions are presented. The multimodal functions have multiple local optima and one global optimum; besides, the number of local optima can be increased with the …

… results in F12 and F13 functions. According to the results of …

… a 30% rate (it found the best answer in six of the 20 functions). The working mechanism of the SCSO algorithm is …

… forms better than other algorithms. The SCSO found the bet- …
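For readers who want to rerun the comparison, the following minimal Python definitions show how three entries of Table 2 translate into objective functions. These are the standard textbook forms of the functions and are given only as an illustration, not as the authors' test harness.

```python
import numpy as np

# Three objectives from Table 2; dimensions and bounds follow the table.
def sphere(x):                      # F1, unimodal, [-100, 100]^30, f_min = 0
    return np.sum(x ** 2)

def rastrigin(x):                   # F9, multimodal, [-5.12, 5.12]^30, f_min = 0
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def ackley(x):                      # F10, multimodal, [-32, 32]^30, f_min = 0
    n = x.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n) + 20.0 + np.e)
```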
… from a large sensitivity range to discover more possible solutions and explore the whole search area. As the iterations progress, by decreasing the value of the sensitivity ranges, the search agents try to exploit and find the global optima. The local optima avoidance in the SCSO algorithm is controlled by the R parameter; it is realized with a balance between the exploration and exploitation phases. Figure 9 shows the convergence curve analysis of the SCSO algorithm in contrast to the other algorithms used on some benchmark functions. The convergence behavior analysis focuses on the exploration and exploitation behavior of each algorithm. The SCSO algorithm has a good performance in the exploration phase due to the mechanism used in the location update of the search agents. In addition, thanks to the attacking mechanism of the SCSO algorithm, it was possible for the search agents to exploit efficiently.

Van den Bergh and Engelbrecht [57] have mentioned that an unexpected change in the movement of the search agents is required in the initial steps of an optimization algorithm. These movement changes make it possible to explore the search space broadly; in the last steps, this movement should be decreased to emphasize exploitation. Generally, the abrupt change in the movement of the search agents causes them to discover possible solutions and exploit them. It was observed that the SCSO algorithm has a good performance in convergence behavior. Regarding the results, according to the mechanism of the SCSO algorithm, the search agents discover areas in the early iterations and then try to exploit them after a certain number of iterations. Herewith, the performance of the SCSO algorithm on the benchmark functions is better than that of the other used metaheuristic algorithms.

In addition to the analysis made, Table 8 presents the p-values calculated by the nonparametric Wilcoxon rank-sum test for the pair-wise comparison over two independent samples (SCSO vs. CSO, GWO, WOA, SSA, GSA, PSO, BWO). The p-values are generated by the Wilcoxon test with a 0.05 significance level and over 30 independent runs. In this table, the plus notation (+) indicates the superiority of the proposed algorithm, ~ indicates that both algorithms obtain equal values, and the minus notation (−) indicates that the obtained solution of the proposed algorithm is worse than that of the compared algorithm.
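A hedged sketch of how such pair-wise p-values can be computed with SciPy's rank-sum test is shown below; the helper name and the +/~/− decision rule are assumptions of this sketch, not the authors' evaluation script.

```python
from scipy.stats import ranksums

def compare_runs(scso_results, other_results, alpha=0.05):
    """Pair-wise Wilcoxon rank-sum test over two sets of 30 independent runs,
    mirroring the comparison reported in Table 8. Returns the p-value and a
    +/~/- style verdict for a minimization problem."""
    stat, p_value = ranksums(scso_results, other_results)
    if p_value >= alpha:
        verdict = "~"                 # no statistically significant difference
    else:
        # lower accumulated best-cost is better when minimizing
        verdict = "+" if sum(scso_results) < sum(other_results) else "-"
    return p_value, verdict
```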
4 SCSO in classical engineering problems

In this section, some engineering problems, which have some equality and inequality constraints, are solved by the SCSO and the results are compared with other algorithms. These problems are seven well-known constrained engineering design problems: welded beam design (WBD), compression spring design (CSD), pressure vessel design (PVD), piston lever (PL), speed reducer design (SRD), three-bar truss design (BTD), and cantilever beam design (CBD). Metaheuristic algorithms can usually be successful in solving such problems. The SCSO algorithm is applied to each problem and the results are compared with the other used metaheuristic algorithms (CSO, GWO, WOA, SSA, GSA, PSO, and BWO). The simulation assumptions are the same as in the previous section.
Fig. 8 The working mechanism of the SCSO algorithm in defined iterations (sample case: Ackley function F10)
$$\text{Minimize } f(\vec{x}) = \left(x_{3} + 2\right) x_{2}\, x_{1}^{2},$$

4.3 Pressure vessel design problem
Fig. 9 The convergence curve analysis of each algorithm in some test functions
According to the simulation results, the proposed algorithm outperformed the other used algorithms in finding the optimum cost. The results are presented in Table 13.

4.6 Three-bar truss design problem

This design considers a three-bar planar truss structure as shown in Fig. 15. The goal is to minimize the relevant weight. This case includes two optimization variables (A1 (= x1), A2 (= x2)) with three optimization constraints: stress, deflection, and buckling. This example is reported in [62] as a highly constrained search space.
Table 9 Comparison results for the welded beam design problem for all algorithms (optimum variables h, l, t, b and optimum cost; the best algorithm has been bold)

Fig. 10 Design parameters of the welded beam design problem

The equation of this problem is given as follows:

$$\text{Minimize: } f(A1, A2) = \left(2\sqrt{2}\,x_{1} + x_{2}\right) \times l$$

$$\text{Subject to: } G_{1} = \frac{\sqrt{2}\,x_{1} + x_{2}}{\sqrt{2}\,x_{1}^{2} + 2 x_{1} x_{2}}\,P - \sigma \leq 0;$$

$$G_{2} = \frac{x_{2}}{\sqrt{2}\,x_{1}^{2} + 2 x_{1} x_{2}}\,P - \sigma \leq 0;$$

$$G_{3} = \frac{1}{\sqrt{2}\,x_{2} + x_{1}}\,P - \sigma \leq 0;$$
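A minimal penalized-fitness sketch of this problem, usable as an objective for SCSO or any of the compared algorithms, is given below. The constants l = 100 cm and P = σ = 2 kN/cm² are the values commonly used for this benchmark in the literature and are assumptions here, since they are not listed in the surviving text; the static penalty is likewise only one possible constraint-handling choice.

```python
import numpy as np

# Assumed benchmark constants (not given in the surviving text).
L_BAR, P_LOAD, SIGMA = 100.0, 2.0, 2.0

def three_bar_truss(x, penalty=1e6):
    """Penalized weight of the three-bar truss for x = (x1, x2) = (A1, A2)."""
    x1, x2 = x
    weight = (2.0 * np.sqrt(2.0) * x1 + x2) * L_BAR          # objective f(A1, A2)
    g1 = (np.sqrt(2.0) * x1 + x2) / (np.sqrt(2.0) * x1**2 + 2.0 * x1 * x2) * P_LOAD - SIGMA
    g2 = x2 / (np.sqrt(2.0) * x1**2 + 2.0 * x1 * x2) * P_LOAD - SIGMA
    g3 = 1.0 / (np.sqrt(2.0) * x2 + x1) * P_LOAD - SIGMA
    violation = sum(max(0.0, g) for g in (g1, g2, g3))        # simple static penalty
    return weight + penalty * violation
```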
Table 10 Comparison results for tension/compression spring design problem for all algorithms

Algorithms  d        D        N         Optimum cost
SCSO        0.0500   0.3175   14.0200   1.2717020E−02
CSO         0.0671   0.8482   2.4074    1.6829585E−02
GWO         0.0500   0.3174   14.0373   1.2727747E−02
WOA         0.0554   0.4526   7.2886    1.2901922E−02
SSA         0.0500   0.3122   14.7463   1.3069754E−02
GSA         0.0606   0.2749   4.8674    1.7762975E−02
PSO         0.0500   0.3175   14.0373   1.2717021E−02
BWO         0.0500   0.3122   14.7963   1.3109512E−02

The best algorithm has been bold

Table 11 Comparison results for pressure vessel design problem for all algorithms

Algorithms  Ts       Th        R         L          Optimum cost
SCSO        0.7798   0.9390    40.3864   199.2918   5917.46
CSO         0.9953   0.4922    51.5106   87.1774    6389.35
GWO         0.8055   0.3992    41.7309   181.2937   5938.51
WOA         0.8093   1.2151    41.1110   189.2692   8497.55
SSA         0.9101   0.4499    47.1576   122.6549   6152.18
GSA         1.0921   14.1349   56.5865   71.8650    84,851.85
PSO         1.0206   0.5045    52.8803   77.0186    6442.20
BWO         4.0618   20.3225   58.7254   77.2508    159,345.5

The best algorithm has been bold
$$\text{Subject to: } G(X) = \frac{61}{x_{1}^{3}} + \frac{37}{x_{2}^{3}} + \frac{19}{x_{3}^{3}} + \frac{7}{x_{4}^{3}} + \frac{1}{x_{5}^{3}} - 1 \leq 0.$$
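A corresponding sketch for the cantilever beam case is given below. Only the constraint above survives in the text, so the weight objective 0.0624(x1 + … + x5) is taken from the form commonly reported for this benchmark in the literature and should be treated as an assumption of this sketch.

```python
def cantilever_beam(x, penalty=1e6):
    """Penalized fitness for the cantilever beam design problem.
    The 0.0624 weight coefficient is an assumption taken from the standard
    formulation of this benchmark; the constraint matches G(X) above."""
    x1, x2, x3, x4, x5 = x
    weight = 0.0624 * (x1 + x2 + x3 + x4 + x5)
    g = 61.0 / x1**3 + 37.0 / x2**3 + 19.0 / x3**3 + 7.0 / x4**3 + 1.0 / x5**3 - 1.0
    return weight + penalty * max(0.0, g)
```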
The proposed algorithm was simulated on 30 benchmark functions and applied to seven engineering optimization problems, and in all of them the results were compared with seven different well-known metaheuristic algorithms (CSO, GWO, WOA, SSA, GSA, PSO, and BWO) and analyzed. According to the results, SCSO performed more successfully and achieved the best results overall among all 8 algorithms. The SCSO algorithm came out on top in 19 of the 30 test functions and ranked first. The BWO algorithm, which ranks second on these functions, found the best solution in only seven functions in total. This displays the superior success rate of the SCSO algorithm. Moreover, the SCSO algorithm achieved the best results in all seven engineering problems focused on.
Table 15 Comparison results for cantilever beam design problem for all algorithms (optimum variables x1–x5 and optimum cost)
Supplementary Information The online version contains supplementary material available at https://doi.org/10.1007/s00366-022-01604-x.

Author contributions AS: conceptualization, investigation, methodology, software, validation, formal analysis, original draft, writing—review and editing. FK: conceptualization, supervision, project administration, methodology, writing—review and editing.

Declarations

Conflict of interest The authors declare no conflict of interest.

References

1. Jamil M, Xin-She Y (2013) A literature survey of benchmark functions for global optimization problems. http://arxiv.org/abs/1308.4008
2. Talbi EG (2009) Metaheuristics: from design to implementation, vol 74. Wiley, New York, pp 5–39
3. Tang C, Zhou Y, Tang Z et al (2021) Teaching-learning-based pathfinder algorithm for function and engineering optimization problems. Appl Intell 51:5040–5066
4. Wolpert DH, Macready WG (1997) No free lunch theorems for optimization. IEEE Trans Evol Comput 1(1):67–82
5. Kiani F, Seyyedabbasi A, Nematzadeh S (2021) Improving the performance of hierarchical wireless sensor networks using the metaheuristic algorithms: efficient cluster head selection. Sens Rev 1–14
6. Kaveh A (2017) Applications of metaheuristic optimization algorithms in civil engineering. Springer International Publishing, Basel. https://doi.org/10.1007/978-3-319-48012-1
7. Kiani F, Seyyedabbasi A, Mahouti P (2021) Optimal characterization of a microwave transistor using grey wolf algorithms. Analog Integr Circ Sig Process 109:599–609
8. Can U, Alatas B (2015) Physics based metaheuristic algorithms for global optimization. Am J Inf Sci Comput Eng 1(3):94–106
9. Back T (1996) Evolutionary algorithms in theory and practice: evolution strategies, evolutionary programming, genetic algorithms. Oxford University Press, Oxford
10. Holland JH (1992) Genetic algorithms. Sci Am 267(1):66–73
11. Storn R, Price K (1997) Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces. J Glob Optim 11(4):341–359
12. Cai X, Zhao H, Shang Sh, Zhou Y et al (2021) An improved quantum-inspired cooperative co-evolution algorithm with multi-strategy and its application. Expert Syst Appl 121:1–13
13. Glover F (1990) Tabu search: a tutorial. Inf J Appl Anal 20(4):75–94
14. Yao X, Liu Y, Lin G (1999) Evolutionary programming made faster. IEEE Trans Evol Comput 3(2):82–102
15. Simon D (2008) Biogeography-based optimization. IEEE Trans Evol Comput 12(6):702–713
16. Hayyolalam V, Kazem AAP (2020) Black widow optimization algorithm: a novel metaheuristic approach for solving engineering optimization problems. Eng Appl Artif Intell 87(103249):1–28
17. Webster B, Bernhard PJ (2003) A local search optimization algorithm based on natural principles of gravitation. Florida Institute of Technology, Technical Reports, pp 1–19
18. Rashedi E, Nezamabadi-Pour H, Saryazdi S (2009) GSA: a gravitational search algorithm. Inf Sci 179(13):2232–2248
19. Erol OK, Eksin I (2006) A new optimization method: big bang–big crunch. Adv Eng Softw 37(2):106–111
20. Hatamlou A (2013) Black hole: a new heuristic optimization approach for data clustering. Inf Sci 222:175–184
21. Kaveh A, Talatahari S (2010) A novel heuristic optimization method: charged system search. Acta Mech 213(3):267–289
22. Moghaddam FF, Moghaddam RF, Cheriet M (2012) Curved space optimization: a random search based on general relativity theory. http://arxiv.org/abs/1208.2214
23. Formato RA (2007) Central force optimization: a new metaheuristic with applications in applied electromagnetics. Prog Electromag Res 77:425–491
24. Shah-Hosseini H (2011) Principal components analysis by the galaxy-based search algorithm: a novel metaheuristic for continuous optimisation. Int J Comput Sci Eng 6(1–2):132–140
25. Kennedy J, Eberhart R (1995) Particle swarm optimization. In: Proceedings of ICNN'95-international conference on neural networks, vol 4. IEEE, pp 1942–1948
26. Dorigo M, Birattari M, Stutzle T (2006) Ant colony optimization. IEEE Comput Intell Mag 1(4):28–39
27. Okdem S, Karaboga D (2009) Routing in wireless sensor networks using an ant colony optimization (ACO) router chip. Sensors 9(2):909–921
28. Seyyedabbasi A, Kiani F (2020) MAP-ACO: an efficient protocol for multi-agent pathfinding in real-time WSN and decentralized IoT systems. Microprocess Microsyst 79(103325):1–9
29. Karaboga D, Basturk B (2007) Artificial bee colony (ABC) optimization algorithm for solving constrained optimization problems. In: International fuzzy systems association world congress. Springer, Berlin, pp 789–798
30. Yang XS (2010) A new metaheuristic bat-inspired algorithm. In: Nature inspired cooperative strategies for optimization (NICSO 2010). Springer, Berlin, pp 65–74
31. Yang XS (2009) Firefly algorithms for multimodal optimization. In: International symposium on stochastic algorithms. Springer, Berlin, pp 169–178
32. Mirjalili S, Mirjalili SM, Lewis A (2014) Grey wolf optimizer. Adv Eng Softw 69:46–61
33. Seyyedabbasi A, Kiani F (2021) I-GWO and Ex-GWO: improved algorithms of the grey wolf optimizer to solve global optimization problems. Eng Comput 37:509–532
34. Mirjalili S, Lewis A (2016) The whale optimization algorithm. Adv Eng Softw 95:51–67
35. Mirjalili S (2016) Dragonfly algorithm: a new metaheuristic optimization technique for solving single objective, discrete, and multi-objective problems. Neural Comput Appl 27(4):1053–1073
36. Yang XS, Deb S (2009) Cuckoo search via Lévy flights. In: 2009 world congress on nature & biologically inspired computing (NaBIC). IEEE, pp 210–214
37. Arora S, Singh S (2019) Butterfly optimization algorithm: a novel approach for global optimization. Soft Comput 23(3):715–734
38. Bayraktar Z, Komurcu M, Werner DH (2010) Wind driven optimization (WDO): a novel nature-inspired optimization algorithm and its application to electromagnetics. In: IEEE Antennas and Propagation Society International Symposium (APSURSI), pp 1–4
39. Chu SC, Tsai PW, Pan JS (2006) Cat swarm optimization. In: Pacific Rim international conference on artificial intelligence. Springer, Berlin, pp 854–858
40. Pan WT (2012) A new fruit fly optimization algorithm: taking the financial distress model as an example. Knowl Based Syst 26:69–74
41. Yang XS (2012) Flower pollination algorithm for global optimization. In: Unconventional computation and natural computation, lecture notes in computer science, vol 7445, pp 240–249
42. Mirjalili S, Gandomi AH, Mirjalili SZ, Saremi S, Faris H, Mirjalili SM (2017) Salp swarm algorithm: a bio-inspired optimizer for engineering design problems. Adv Eng Softw 114:163–191
43. Tang C, Zhou Y, Luo Q et al (2021) An enhanced pathfinder algorithm for engineering optimization problems. Eng Comput. https://doi.org/10.1007/s00366-021-01286-x
44. Heidari AA, Mirjalili S, Faris H, Aljarah I, Mafarja M, Chen H (2019) Harris hawks optimization: algorithm and applications. Future Gener Comput Syst 97:849–872
45. Zhong L, Zhou Y, Luo Q, Zhong K (2021) Wind driven dragonfly algorithm for global optimization. Concurr Comput Pract Exp 33(6):e6054
46. Wang Z, Luo Q, Zhou Y (2021) Hybrid metaheuristic algorithm using butterfly and flower pollination base on mutualism mechanism for global optimization problems. Eng Comput 37:3665–3698
47. Cole FR, Wilson DE (2015) Felis margarita (Carnivora: Felidae). Mamm Species 47(924):63–77
48. Huang G, Rosowski J, Ravicz M, Peake W (2002) Mammalian ear specializations in arid habitats: structural and functional evidence from sand cat (Felis margarita). J Comp Physiol A 188(9):663–681
49. Abbadi M (1989) Radiotelemetric observations on sand cats (Felis margarita) in the Arava Valley. Isr J Zool 36:155–156
50. Liang JJ, Qu BY, Suganthan PN (2013) Problem definitions and evaluation criteria for the CEC 2014 special session and competition on single objective real-parameter numerical optimization, vol 635. Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou, China and Technical Report, Nanyang Technological University, Singapore, pp 1–32
51. Liang JJ, Qu BY, Suganthan PN, Chen Q (2014) Problem definitions and evaluation criteria for the CEC 2015 competition on learning-based real-parameter single objective optimization, vol 29. Technical Report 201411A. Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou, China and Technical Report, Nanyang Technological University, Singapore, pp 625–640
52. Price KV, Awad NH, Ali MZ, Suganthan PN (2018) The 100-digit challenge: problem definitions and evaluation criteria for the 100-digit challenge special session and competition on single objective numerical optimization. Nanyang Technological University
53. Yao X, Liu Y, Lin G (1999) Evolutionary programming made faster. Evolut Comput IEEE Trans 3:82–102
54. Seyyedabbasi A, Aliyev R, Kiani F, Gulle M, Basyildiz H, Shah M (2021) Hybrid algorithms based on combining reinforcement learning and metaheuristic methods to solve global optimization problems. Knowl Based Syst 223:1–22
55. Molga M, Smutnicki C (2005) Test functions for optimization needs
56. Jamil M, Yang X (2013) A literature survey of benchmark functions for global optimization problems. Int J Math Model Numer Optim 4(2):1–47
57. Van den Bergh F, Engelbrecht AP (2006) A study of particle swarm optimization particle trajectories. Inf Sci 176(8):937–971
58. Coello CAC (2000) Use of a self-adaptive penalty approach for engineering optimization problems. Comput Ind 41(2):113–127
59. Chattopadhyay S (2004) Pressure vessels: design and practice, 1st edn. CRC Press, Boca Raton. https://doi.org/10.1201/9780203492468
60. Gandomi AH, Yang XS, Alavi AH (2013) Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems. Eng Comput 29:17–35
61. Bayzidi H, Talatahari S, Saraee M, Lamarche CP (2021) Social network search for solving engineering optimization problems. Comput Intell Neurosci
62. Nowcki H (1974) Optimization in pre-contract ship design. In: Fujita Y, Lind K, Williams TJ (eds) Computer applications in the automation of shipyard operation and ship design, vol 2. North Holland, Elsevier, New York, pp 327–338
63. Chickermane H, Gea HC (1996) Structural optimization using a new local approximation method. Int J Numer Methods Eng 39(5):829–846

Publisher's Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.