Applied Mathematics and Computation 218 (2012) 6095–6117
Novel selection schemes for harmony search
Mohammed Azmi Al-Betar a,b,*, Iyad Abu Doush c, Ahamad Tajudin Khader a, Mohammed A. Awadallah a

a School of Computer Sciences, Universiti Sains Malaysia, 11800 USM, Penang, Malaysia
b Department of Computer Science, Al-zaytoonah University of Jordan, Amman, Jordan
c Computer Science Department, Yarmouk University, Irbid, Jordan
Keywords:
Harmony search algorithm
Selection schemes
Evolutionary Algorithm
Selective pressure
Memory consideration
Genetic Algorithm
Abstract
Selection is a vital component of Evolutionary Algorithms (EAs), in which the fitness value of a solution influences the evolution process. Normally, an efficient selection method makes use of the Darwinian principle of natural selection (i.e., survival of the fittest). Harmony search (HS) is a recent EA inspired by the musical improvisation process of seeking a pleasing harmony. Originally, two selection methods are used in HS: (i) the memory consideration selection method, where the values of the decision variables are randomly selected from the population (i.e., the solutions stored in harmony memory (HM)) to generate a new harmony, and (ii) the selection of a new solution into HM, where a greedy selection is used to update the HM. Memory consideration selection, the focal point of this paper, is not based on the natural selection principle; it draws heavily on random selection. In this paper, novel selection schemes that replace the random selection scheme in memory consideration are investigated: global-best, fitness-proportional, tournament, linear rank, and exponential rank. Each proposed selection scheme is individually adapted and incorporated into the memory consideration process, and each adoption is realized as a new HS variation. The performance of the proposed HS variations is evaluated and a comparative study is conducted. The experimental results on benchmark functions show that the selection scheme incorporated in memory consideration directly affects the performance of the HS algorithm. Finally, a parameter sensitivity analysis of the proposed HS variations is presented.

© 2011 Elsevier Inc. All rights reserved.
1. Introduction
The harmony search (HS) algorithm, a population-based metaheuristic proposed by Geem et al. [1], has attracted the attention of the optimization research community due to its impressive advantages: it stipulates fewer mathematical requirements and iteratively generates a new solution after considering all the existing solutions [2]; it has a novel stochastic derivative which reduces the number of iterations required to converge towards local minima [3]; and it can handle both discrete and continuous variables. In other words, these advantages relate to simplicity, flexibility, adaptability, generality, and scalability [4]. The HS algorithm has been successfully tailored to several optimization problems such as structural optimization, the multi-buyer multi-vendor supply chain problem, timetabling, and flow shop scheduling [5–11], among many others, as shown in [12]. The basic structure and performance of the HS algorithm have been improved over time to keep pace with the requirements of the applications being developed [13,14]. This is done by tuning HS parameters
* Corresponding author at: School of Computer Sciences, Universiti Sains Malaysia, 11800 USM, Penang, Malaysia.
E-mail address: mohbetar@cs.usm.my (M.A. Al-Betar).
0096-3003/$ - see front matter © 2011 Elsevier Inc. All rights reserved.
doi:10.1016/j.amc.2011.11.095
[2,15–17] and/or hybridizing it with characteristics of other effective optimization methods [18]. Theoretically, a mathematical analysis of the exploratory power of the HS algorithm has also been carried out [19].
As HS is an Evolutionary Algorithm (EA) [18], it begins with a set of provisional solutions stored in harmony memory
(HM). At each evolution, a new solution, called the new harmony, is generated based on three operators: (i) memory consideration, which selects the variable values of the new harmony from the HM solutions; (ii) random consideration, used to diversify the new harmony; and (iii) pitch adjustment, which is responsible for local improvement. The new harmony is then evaluated to replace the worst solution in HM, if it is better. The solutions in HM evolve iteratively in the hope of obtaining better solutions in subsequent evolutions. This process is repeated until a stop criterion is satisfied.
In the memory consideration operator, selection is the process of choosing the values of the variables from the solutions
stored in HM that will be used to generate the new harmony. In EA, selection is an artificial process which mimics the natural
selection (i.e., survival of the fittest). Selection is the most influential element used in EA, where the fitness values of the solutions have a great impact on the evolution process. In fact, the fitness value in EA has essential features for individuals: it is clearly defined, direct, and valuable, which logically makes it an active fitness-based component [20]. Basically, being more or less directed toward specific (i.e., the best) individuals is the common property of any selection scheme implemented in EA [20].
The memory consideration operator in the HS algorithm originally selects the values of the variables randomly from any solution, giving little consideration to the natural selection principle, under which the fittest solutions should have a higher probability of contributing variable values to the new harmony. In EA, several selection schemes, which imitate the
natural selection principle, have been proposed to guide the search process. This paper investigates selection schemes for
memory consideration to emulate Darwin’s principle of the ‘survival of the fittest’. Along with the original random selection
scheme proposed by Geem et al. [1], five others were investigated: global-best, proportional, tournament, linear ranking, and
exponential ranking selection schemes. These selection schemes are borrowed from other powerful EAs (i.e., Particle Swarm
Optimization (PSO) [21,22] and Genetic Algorithm (GA) [23]) and altered to be applicable for HS algorithm. Using standard
benchmark functions, the experimental results show that the selection schemes incorporated with memory consideration
have a high impact on the performance of HS algorithm.
The remainder of this paper is organized as follows: HS algorithm is overviewed in Section 2. The novel selection schemes
incorporated with memory consideration are discussed in Section 3. Results of the experiments are presented in Section 4.
Finally, Section 5 concludes the paper and gives suggestions for possible future directions.
2. Harmony search algorithm principles
HS is an Evolutionary Algorithm (EA) inspired by the musical improvisation process [1], in which a group of musicians improvise the pitches of their musical instruments, practice after practice, seeking a pleasing harmony as determined by an audio-aesthetic standard. Analogously, in optimization, a set of decision variables is assigned values, iteration by iteration, seeking a 'good enough' solution as evaluated by an objective function. HS has five main steps, illustrated as a flowchart in Fig. 1 and described as follows:
Step 1: Initialize the problem and HS parameters. Normally, the optimization problem is initially modeled as:
min {f(x) | x ∈ X}, where f(x) is the objective function; x = {x_i | i = 1, …, N} is the set of decision variables; X = {X_i | i = 1, …, N} is the set of possible value ranges, where X_i ∈ [LB_i, UB_i], LB_i and UB_i are the lower and upper bounds of the decision variable x_i, respectively, and N is the number of decision variables.
The parameters of the HS algorithm required to solve the optimization problem are also specified in this step:
Fig. 1. The flowchart of the HS algorithm.
M.A. Al-Betar et al. / Applied Mathematics and Computation 218 (2012) 6095–6117
6097
(a) The Harmony Memory Consideration Rate (HMCR), used in the improvisation process to determine whether the
value of a decision variable is to be selected from the solutions stored in the harmony memory (HM).
(b) The Harmony Memory Size (HMS) is similar to the population size in Genetic Algorithm.
(c) The Pitch Adjustment Rate (PAR), decides whether the decision variables are to be adjusted to a neighbouring
value.
(d) The distance bandwidth (BW), also known as fret width (FW) [13], determines the distance of adjustment in the
pitch adjustment operator.
(e) The Number of Improvisations (NI) corresponds to the number of iterations.
These parameters will be explained in more detail in the next steps. Note that the HMCR and PAR are the two parameters
which control the three operators of HS algorithm (i.e., (i) memory consideration is controlled by HMCR, (ii) random consideration is controlled by 1 − HMCR, and (iii) pitch adjustment is controlled by PAR).
Step 2: Initialize the harmony memory. The harmony memory (HM) is an augmented matrix of size N × HMS which contains the set of solution vectors determined by HMS (see (1)). In this step, these vectors are randomly generated as x_i^j = LB_i + (UB_i − LB_i) × U(0, 1), ∀i = 1, 2, …, N and ∀j = 1, 2, …, HMS, where U(0, 1) generates a uniform random number between 0 and 1. The generated solutions are stored in HM in ascending order of their objective function values.
HM = | x_1^1    x_2^1    ⋯   x_N^1   |
     | x_1^2    x_2^2    ⋯   x_N^2   |        (1)
     | ⋮        ⋮             ⋮      |
     | x_1^HMS  x_2^HMS  ⋯   x_N^HMS |
Step 3: Improvise a new harmony. In this step, the HS algorithm generates (improvises) a new harmony vector from scratch, x' = (x'_1, x'_2, …, x'_N), based on three operators: (1) memory consideration, (2) random consideration, and (3) pitch adjustment.
Memory consideration. In memory consideration, the value of the first decision variable x'_1 is randomly selected from the historical values {x_1^1, x_1^2, …, x_1^HMS} stored in the HM vectors. Values of the other decision variables, (x'_2, x'_3, …, x'_N), are sequentially selected in the same manner with probability (w.p.) HMCR, where HMCR ∈ (0, 1). It is worth noting that the selection scheme in memory consideration is random and the natural selection principle is not used (i.e., the value of a decision variable is selected from any solution by an unguided selection scheme).
Random consideration. Decision variables that are not assigned values according to memory consideration are randomly assigned values from their possible range by random consideration, with a probability of (1 − HMCR), as follows:

x'_i = { x'_i ∈ {x_i^1, x_i^2, …, x_i^HMS}   w.p. HMCR,
         x'_i ∈ X_i                          w.p. 1 − HMCR.
Pitch adjustment. Each decision variable x'_i of a new harmony vector, x' = (x'_1, x'_2, x'_3, …, x'_N), that has been assigned a value by memory consideration is pitch adjusted with the probability of PAR, where PAR ∈ (0, 1), as follows:

Pitch adjusting decision for x'_i = { Yes   w.p. PAR,
                                      No    w.p. 1 − PAR.

If the pitch adjustment decision for x'_i is Yes, the value of x'_i is modified to a neighboring value as follows: x'_i = x'_i + U(−1, 1) × FW.
Step 4: Update the harmony memory. If the new harmony vector, x' = (x'_1, x'_2, …, x'_N), is better than the worst harmony x^worst stored in HM in terms of the objective function value (i.e., x^worst = x^HMS when HM is sorted), the new harmony vector is included in the HM and the worst harmony vector is excluded from it. This is a greedy selection scheme in which the principle of natural selection is applied.
Step 5: Check the stop criterion. Step 3 and step 4 of HS algorithm are repeated until the stop criterion (maximum number
of improvisations) is met. This is specified by NI parameter.
The procedure of HS algorithm can be presented as in Algorithm 1:
Algorithm 1. Harmony search algorithm

Set HMCR, PAR, NI, HMS, FW.
x_i^j = LB_i + (UB_i − LB_i) × U(0, 1), ∀i = 1, 2, …, N and ∀j = 1, 2, …, HMS {generate HM solutions}
Calculate f(x^j), ∀j = 1, 2, …, HMS
Sort(HM)
itr = 0
while (itr ≤ NI) do
  x' = ∅
  for i = 1, …, N do
    if (U(0, 1) ≤ HMCR) then
      x'_i ∈ {x_i^1, x_i^2, …, x_i^HMS} {memory consideration}
      if (U(0, 1) ≤ PAR) then
        x'_i = x'_i + U(−1, 1) × FW {pitch adjustment}
      end if
    else
      x'_i = LB_i + (UB_i − LB_i) × U(0, 1) {random consideration}
    end if
  end for
  if (f(x') < f(x^worst)) then
    Include x' in the HM.
    Exclude x^worst from the HM.
  end if
  itr = itr + 1
end while
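Algorithm 1 can also be rendered as a short, runnable sketch. The following Python code is purely illustrative (it is not the authors' implementation): the Sphere objective, the helper names `sphere` and `harmony_search`, the bound-clamping guard, and all parameter values are our own assumptions.

```python
# Illustrative Python sketch of Algorithm 1 (not the authors' code).
# The Sphere objective and all parameter values below are our own choices.
import random

def sphere(x):
    # f(x) = sum of squares; global minimum 0 at the origin
    return sum(v * v for v in x)

def harmony_search(f, lb, ub, n, hms=5, hmcr=0.9, par=0.3, fw=0.01, ni=5000, seed=1):
    rng = random.Random(seed)
    # Step 2: fill HM with random vectors, kept sorted (hm[0] best, hm[-1] worst).
    hm = [[lb + (ub - lb) * rng.random() for _ in range(n)] for _ in range(hms)]
    hm.sort(key=f)
    for _ in range(ni):                                   # Step 5: NI improvisations
        new = []
        for i in range(n):                                # Step 3: improvise x'
            if rng.random() <= hmcr:                      # memory consideration
                xi = hm[rng.randrange(hms)][i]
                if rng.random() <= par:                   # pitch adjustment
                    xi += rng.uniform(-1.0, 1.0) * fw
                    xi = min(max(xi, lb), ub)             # practical bound guard (ours)
            else:                                         # random consideration
                xi = lb + (ub - lb) * rng.random()
            new.append(xi)
        if f(new) < f(hm[-1]):                            # Step 4: greedy HM update
            hm[-1] = new
            hm.sort(key=f)
    return hm[0]

best = harmony_search(sphere, -100.0, 100.0, n=2)
print(sphere(best))   # objective value of the best harmony found
```

Because HM is kept sorted, `hm[-1]` is always the worst harmony, so Step 4 reduces to a single comparison per improvisation.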
3. Selection schemes
In HS algorithm, there are two places where the selection process has to be triggered:
(i) In memory consideration, where the HS selects the value of each decision variable randomly from its corresponding values stored in HM solutions with a probability HMCR.
(ii) In step 4 (update the HM), where HS uses a greedy selection to replace the worst harmony stored in HM with the new improvised harmony.
The selection process used in memory consideration is the focal point of this paper. In the original HS algorithm, this selection scheme selects a random solution from HM through an unguided process that largely ignores the Darwinian selection principle of 'survival of the fittest'.
Generally speaking, the selection scheme in EA directs the search toward better individuals while enforcing a high diversity of the population [24]. The selection scheme has to maintain the diversity of the population so as not to fall into premature convergence [25]. Selection is the force that determines the convergence level of an EA: selecting many solutions from the memory leads to premature convergence, while selecting a small number of solutions makes the algorithm progress more slowly [26].
In EAs, any selection scheme has two phases: selection and sampling [27]. In the selection phase, each solution in the
population is assigned with selection probability based on its fitness value. In the sampling phase, the solutions of the
next population are sampled based on the selection probability. Selection schemes are classified into static and dynamic
selection [20]. In static selection scheme, the selection probability of each solution is determined in advance and then
remains constant during the search. Examples of this scheme include tournament selection [28], linear rank [27], and
exponential rank [26]. In contrast, the dynamic selection scheme updates the selection probability of each solution in
the population at each evolution. Another classification categorizes selection schemes into fitness-proportionate and rank-based [26]. The fitness-proportionate class calculates the selection probability based on the absolute fitness value of each individual, while in the rank-based class the selection probability is determined based on fitness ranking rather than absolute fitness. A simple example of the fitness-proportionate class is the traditional proportional selection scheme [23]. Some selection schemes have scaling problems which can lead to premature convergence (e.g., proportional selection). Other selection schemes suffer from an imbalance between fitness and reproductive ability (e.g., linear rank) [26].
Conventionally, the selective pressure, an informal term characterizing the strength of a selection scheme, is linked to the
process of evaluating any selection scheme. In HS algorithm, the selective pressure can be defined as the probability of
selecting the values of the variables from the better solution compared to the average probability of selection of all solutions
in HM. In general, the main aim of selective pressure is to focus the search on regions of the search space known to have better solutions. A higher selective pressure tends to yield a more efficient selection method [29]. However, selective pressure has a direct influence on the diversity of the algorithm: higher selective pressure leads to more exploitation and less exploration, and may thus lead to premature convergence.
In this paper, six selection schemes incorporated in memory consideration are presented, including the original random selection mechanism. The proposed selection schemes are altered so that they are applicable to HS. These selection schemes are adopted in the memory consideration phase in such a way that a variable value is selected from a solution in the harmony memory. The selection is performed according to the HMCR and is used to choose one variable value from the solutions in the harmony memory.
The discussion will take into consideration how each selection method calculates the selection probability and the way of
sampling (or assigning) the values of each decision variable in the new harmony. The selection schemes are illustrated in the
following subsections where the following assumptions are made to standardize the terms used:
(i) The solutions, [x^1, x^2, …, x^HMS]^T, stored in the HM of size HMS are ordered according to their fitness values, where x^1 is the best solution (i.e., f(x^1) < f(x^i), ∀i = 2, …, HMS) and x^HMS is the worst.
(ii) The selection probabilities of the solutions (x^1, x^2, …, x^HMS) are denoted (p_1, p_2, …, p_HMS), where p_i is the selection probability of solution i, ∀i ∈ (1, 2, …, HMS).
(iii) The selective pressure of any solution is determined by its selection probability: a solution with a higher selection probability is expected to exert a higher selective pressure.
(iv) For any selection scheme, the cumulative selection probability is unity (i.e., Σ_{i=1}^{HMS} p_i = 1).
(v) The value of each decision variable x'_i, ∀i ∈ (1, 2, …, N), in the new harmony (discussed later) is selected from HM with probability HMCR.
At the end of each subsection defining a selection scheme, we introduce a simple numerical example to clarify how the selection method is used in the HS algorithm. We use the Sphere function (defined in Table 7) for all the examples. In all examples, HMS = 5, HMCR = 0.9, and PAR = 0.3.
3.1. Random selection scheme
The random selection scheme as defined by Geem et al. [1] works as follows:
(i) Selection probability: all solutions stored in the HM have equal probability of donating the value of any decision variable, i.e., p_1 = p_2 = ⋯ = p_HMS = 1/HMS.
(ii) Sampling method: the value of the decision variable x'_i is randomly sampled from any solution stored in HM, i.e., x'_i = x_i^k, where k ~ U(1, 2, …, HMS).
Note that the selective pressure is equal for all the solutions stored in HM.
Example 3.1. In this selection scheme, all the values stored in the HM have the same chance of being selected (see Table 1). The probability of selecting any solution from the harmony memory is the same, since this selection scheme randomly selects any of the solutions.
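The uniform probabilities of this scheme can be sketched in a few lines of Python (an illustrative sketch; the variable names are ours):

```python
# Random selection scheme: every solution in HM gets probability 1/HMS.
# HMS = 5 as in the numerical examples of this section.
import random

HMS = 5
probs = [1.0 / HMS] * HMS          # p_1 = p_2 = ... = p_HMS = 0.2
rng = random.Random(0)
k = rng.randrange(HMS)             # donor solution index, uniform over HM

print(probs, sum(probs), k)
```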
3.2. Global-best selection scheme
The global best is a primary concept of Particle Swarm Optimization (PSO) [21,22], which was used successfully by Omran and Mahdavi [18] in pitch adjustment. The global-best selection scheme modifies the random selection
Table 1
Example of random selection scheme.

Rank(i) | f(x^i) | p_i
1       | 1.2    | 0.2
2       | 3.7    | 0.2
3       | 7.6    | 0.2
4       | 12.8   | 0.2
5       | 17.7   | 0.2
Total   |        | 1.0
Table 2
Example of global-best selection scheme.

Rank(i) | f(x^i) | p_i
1       | 1.2    | 1.0
2       | 3.7    | 0
3       | 7.6    | 0
4       | 12.8   | 0
5       | 17.7   | 0
Total   |        | 1.0
scheme in memory consideration so that the memory consideration can select the values of the new harmony from the best
solution in the HM. Global-best selection scheme works as follows:
(i) Selection probability: the selection probability of the best solution is unity (i.e., p_1 = 1), while the selection probability of the remaining solutions is zero (i.e., p_i = 0, ∀i ∈ (2, …, HMS)).
(ii) Sampling method: the value of each decision variable is sampled from the best harmony stored in the HM, i.e., x'_k = x_k^j, where j = arg min_{j∈[1,HMS]} f(x^j) and k ∈ (1, 2, …, N).
This selection scheme focuses on the single best solution in HM and this gives it a superior selective pressure.
Example 3.2. In this selection scheme the best solution in the harmony memory will be selected (see Table 2). The
probability of selecting any solution from the harmony memory other than the best solution is zero.
3.3. Proportional selection scheme
The proportional (or roulette wheel) selection scheme is the most traditional selection method, proposed by Holland [23]. In this method, the selection probability depends on the absolute fitness value of a solution compared to the absolute fitness values of the other solutions stored in HM. The proportional selection scheme works as follows:
(i) Selection probability: the selection probability p_i for the solution i is proportional to its fitness value¹, based on:

p_i = f(x^i) / Σ_{j=1}^{HMS} f(x^j).    (2)
(ii) Sampling method: the value of the decision variable x'_i is sampled from the solution k in the HM, i.e., x'_i = x_i^k, where k is chosen using the roulette wheel method shown in Algorithm 2:
Algorithm 2. Pseudocode for the roulette wheel method

1: Set r ~ U(0, 1).
2: Set found = False
3: Set sum_prob = 0.
4: Set i = 1, k = 0.
5: while (i ≤ HMS) AND NOT (found) do
6:   sum_prob = sum_prob + p_i
7:   if (sum_prob ≥ r) then
8:     k = i
9:     found = True
10:  end if
11:  i = i + 1
12: end while
13: return (k)
In Algorithm 2, r is a value picked uniformly at random from U(0, 1); sum_prob holds the cumulative selection probabilities, where sum_prob = Σ_{i=1}^{j} p_i is the cumulative selection probability of solution x^j. Note that the cumulative selection probability of solution x^HMS is unity.
¹ The selection probability is proportional to the fitness value in the case of a maximization objective function. In the case of minimization, the selection probability is proportional to 1/(fitness value).
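Algorithm 2, together with the 1/f probabilities used for minimization (footnote 1), can be sketched in Python as follows. This is an illustrative sketch, not the authors' code; the function name `roulette_wheel` is ours and the fitness values are those of Table 3.

```python
# Proportional (roulette wheel) selection for a minimization problem:
# probability proportional to 1/f(x^i), fitness values from Table 3.
import random

fitness = [1.2, 3.7, 7.6, 12.8, 17.7]
inv = [1.0 / f for f in fitness]
total = sum(inv)
probs = [v / total for v in inv]             # approx. [0.609, 0.197, 0.096, 0.057, 0.037]

def roulette_wheel(probs, rng):
    # Walk the cumulative distribution until it exceeds a uniform random r.
    r = rng.random()
    cum = 0.0
    for k, p in enumerate(probs):
        cum += p
        if cum >= r:
            return k
    return len(probs) - 1                    # guard against floating-point rounding

rng = random.Random(0)
k = roulette_wheel(probs, rng)
print(probs, k)
```

The better (smaller) the objective value, the larger the slice of the wheel, matching the sum_prob column of Table 3.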
It should be borne in mind that this selection scheme is dynamic, as the selection probability of each solution in HM is re-evaluated at each iteration [20]. There are some shortcomings in the implementation of this selection method [30]:
(i) An outstanding solution in HM normally has a much higher selection probability, and thus a higher selective pressure, which leads to premature convergence due to diversity loss.
(ii) Once the solutions stored in HM have similar fitness values, the selective pressure becomes almost equal for all solutions and memory consideration degenerates into an unguided selection process.
(iii) Once the fitness values of the HM solutions are transposed, this selection scheme behaves differently.
Example 3.3. In this selection scheme, the roulette wheel method is used to select one of the solutions in the harmony
memory. Table 3 shows the selection probability of the HM solutions. The solution with a better fitness value (i.e., minimum
one in our example) has the highest probability to be selected (see Fig. 2).
3.4. Tournament selection scheme
The tournament selection scheme was initially proposed by Goldberg [28]. It randomly samples k solutions from the solutions stored in the HM and then selects the solution with the best fitness from that tournament. The tournament selection scheme works as follows:
(i) Selection probability: given a tournament of size k, the solutions in the k-tournament are randomly selected. The selection probability of the solution of rank i is determined as follows:

p_i = (1/HMS^k) × ((HMS − i + 1)^k − (HMS − i)^k).    (3)

Eq. (3) is proved in [20, p. 173].
(ii) Sampling method: the value of the decision variable x'_i is sampled from the solution x^j, where j is the index of the best solution in the k-tournament.
Table 3
Example of proportional selection scheme.

Rank(i) | f(x^i) | 1/f(x^i) | p_i    | sum_prob_i
1       | 1.2    | 0.833    | 0.6085 | 0.6085
2       | 3.7    | 0.270    | 0.1972 | 0.8057
3       | 7.6    | 0.132    | 0.0964 | 0.9021
4       | 12.8   | 0.078    | 0.0570 | 0.9591
5       | 17.7   | 0.056    | 0.0365 | 1.0
Total   |        | 1.369    | 1.0    |
Fig. 2. The selection process of roulette wheel method used for Example 3.3.
Table 4
Example of tournament selection scheme.

Rank(i) | f(x^i) | Tournament elements | p_i
1       | 1.2    | ✗                   | 0
2       | 3.7    | ✓                   | 1.0
3       | 7.6    | ✗                   | 0
4       | 12.8   | ✓                   | 0
5       | 17.7   | ✓                   | 0
Total   |        |                     | 1.0
This mechanism has certain advantages over other selection schemes: it does not require ranking the whole population first, and its selective pressure, which is tuned by the tournament size k, can be adjusted easily [31].
Example 3.4. In this selection scheme, assume a tournament of size k = 3. Three solutions are then randomly selected from the HM to enter the tournament (e.g., x^2, x^4, x^5). Table 4 shows the selection probability of the HM solutions in the tournament. The HS algorithm will select the best solution among those participating in the tournament (i.e., x^2).
3.5. Linear ranking selection scheme
Linear ranking is another selection scheme, introduced to overcome the drawbacks of the proportional selection scheme [27]. The rationale behind rank-based selection schemes is to determine the selection probability of the solutions stored in HM according to the rank of their fitness. This selection scheme is static [20]: the selection probability is determined in advance and remains constant during the search process. The linear ranking selection scheme works as follows:
(i) Selection probability: considering the assumptions above, let η⁺ = HMS × p_1 and η⁻ = HMS × p_HMS, where η⁺ is the expected value of the best solution in HM and η⁻ is the expected value of the worst. Both η⁺ and η⁻ determine the slope of the linear function. The ranked selection probability of the solution x^i is determined by the linear mapping given in Eq. (4) [31]:
Table 5
Example of linear ranking selection scheme.

Rank(i) | f(x^i) | p_i = (1/HMS)(η⁺ − (η⁺ − η⁻)(i − 1)/(HMS − 1)) | p_i  | sum_prob_i
1       | 1.2    | p_1 = (1/5)(1.8 − (1.8 − 0.2)(1 − 1)/(5 − 1)) | 0.36 | 0.36
2       | 3.7    | p_2 = (1/5)(1.8 − (1.8 − 0.2)(2 − 1)/(5 − 1)) | 0.28 | 0.64
3       | 7.6    | p_3 = (1/5)(1.8 − (1.8 − 0.2)(3 − 1)/(5 − 1)) | 0.2  | 0.84
4       | 12.8   | p_4 = (1/5)(1.8 − (1.8 − 0.2)(4 − 1)/(5 − 1)) | 0.12 | 0.96
5       | 17.7   | p_5 = (1/5)(1.8 − (1.8 − 0.2)(5 − 1)/(5 − 1)) | 0.04 | 1.0
Total   |        |                                               | 1.0  |
Fig. 3. The selection process of roulette wheel method used for Example 3.5.
Table 6
Example of exponential ranking selection scheme.

Rank(i) | f(x^i) | c_i         | p_i   | sum_prob_i
1       | 1.2    | s⁰ = 1      | 0.244 | 0.244
2       | 3.7    | s¹ = 0.9    | 0.22  | 0.464
3       | 7.6    | s² = 0.81   | 0.198 | 0.662
4       | 12.8   | s³ = 0.729  | 0.178 | 0.84
5       | 17.7   | s⁴ = 0.6561 | 0.16  | 1.0
Total   |        | 4.0951      | 1.0   |
Fig. 4. The selection process of roulette wheel method used for Example 3.6.
Table 7
Benchmark functions used to evaluate HS variations.

Function name | Expression | Search range | Optimum value | Category [16] | Landscape
Sphere function | f1(x) = Σ_{i=1}^N x_i² | x_i ∈ [−100, 100] | min(f1) = f(0, …, 0) = 0 | Unimodal | Fig. 5(a)
Schwefel's problem 2.22 [34] | f2(x) = Σ_{i=1}^N |x_i| + Π_{i=1}^N |x_i| | x_i ∈ [−10, 10] | min(f2) = f(0, …, 0) = 0 | Unimodal | Fig. 5(b)
Step function | f3(x) = Σ_{i=1}^N (⌊x_i + 0.5⌋)² | x_i ∈ [−100, 100] | min(f3) = f(0, …, 0) = 0 | Unimodal & discontinuous | Fig. 5(c)
Rosenbrock function | f4(x) = Σ_{i=1}^{N−1} (100(x_{i+1} − x_i²)² + (x_i − 1)²) | x_i ∈ [−30, 30] | min(f4) = f(1, …, 1) = 0 | Multimodal | Fig. 5(d)
Rotated hyper-ellipsoid function | f5(x) = Σ_{i=1}^N (Σ_{j=1}^i x_j)² | x_i ∈ [−100, 100] | min(f5) = f(0, …, 0) = 0 | Unimodal | Fig. 5(e)
Schwefel's problem 2.26 [34] | f6(x) = −Σ_{i=1}^N x_i sin(√|x_i|) | x_i ∈ [−500, 500] | min(f6) = f(420.9687, …, 420.9687) = −12569.5 | Multimodal | Fig. 5(f)
Rastrigin function | f7(x) = Σ_{i=1}^N (x_i² − 10 cos(2πx_i) + 10) | x_i ∈ [−5.12, 5.12] | min(f7) = f(0, …, 0) = 0 | Multimodal | Fig. 5(g)
Ackley's function | f8(x) = −20 exp(−0.2 √((1/30) Σ_{i=1}^N x_i²)) − exp((1/30) Σ_{i=1}^N cos(2πx_i)) + 20 + e | x_i ∈ [−32, 32] | min(f8) = f(0, …, 0) = 0 | Multimodal | Fig. 5(h)
Griewank function | f9(x) = (1/4000) Σ_{i=1}^N x_i² − Π_{i=1}^N cos(x_i/√i) + 1 | x_i ∈ [−600, 600] | min(f9) = f(0, …, 0) = 0 | Multimodal | Fig. 5(i)
Six-Hump Camel-Back function | f10(x) = 4x_1² − 2.1x_1⁴ + (1/3)x_1⁶ + x_1x_2 − 4x_2² + 4x_2⁴ | x_i ∈ [−5, 5] | min(f10) = f(0.08983, −0.7126) = −1.0316285 | Multimodal | Fig. 5(j)
Shifted Sphere function [35] | f11(x) = Σ_{i=1}^N z_i² + f_bias1, where z = x − o | x_i ∈ [−100, 100] | min(f11) = f(o_1, …, o_N) = f_bias1 = −450 | Unimodal | Fig. 5(k)
Shifted Schwefel's problem 1.2 [35] | f12(x) = Σ_{i=1}^N (Σ_{j=1}^i z_j)² + f_bias2, where z = x − o | x_i ∈ [−100, 100] | min(f12) = f(o_1, …, o_N) = f_bias2 = −450 | Unimodal | Fig. 5(l)
Shifted Rosenbrock [35] | f13(x) = Σ_{i=1}^{N−1} (100(z_{i+1} − z_i²)² + (z_i − 1)²) + f_bias6, where z = x − o | x_i ∈ [−100, 100] | min(f13) = f(o_1, …, o_N) = f_bias6 = 390 | Multimodal | Fig. 5(m)
Shifted Rastrigin [35] | f14(x) = Σ_{i=1}^N (z_i² − 10 cos(2πz_i) + 10) + f_bias9, where z = x − o | x_i ∈ [−5, 5] | min(f14) = f(o_1, …, o_N) = f_bias9 = −330 | Multimodal | Fig. 5(n)
p_i = (1/HMS) × (η⁺ − (η⁺ − η⁻)(i − 1)/(HMS − 1)),    (4)

where i is the rank index of the solution x^i, ∀i ∈ (1, 2, …, HMS). The conditions Σ_{i=1}^{HMS} p_i = 1 and p_i ≥ 0, ∀i ∈ (1, 2, …, HMS), require that 1 ≤ η⁺ ≤ 2 and η⁻ = 2 − η⁺ be fulfilled. Normally, the value of η⁺, which is set in advance, determines the selective pressure; η⁺ = 1.1 is recommended [32].
(ii) Sampling method: the value of the decision variable x'_i is sampled from the solution k in the HM, i.e., x'_i = x_i^k, where k is chosen using the roulette wheel method shown in Algorithm 2 (i.e., the kth solution is selected according to the cumulative probability that first exceeds a random number generated between 0 and 1).
The main advantage of the linear rank selection scheme over the proportional selection scheme is that the selection probability of each solution is calculated according to its rank among the other solutions, completely ignoring its absolute fitness value. Therefore, the selective pressure toward the better solutions in HM remains constant during the search and can be tuned directly. For example, η⁺ = 1 means the linear rank behaves like the random selection scheme, and the larger η⁺ is, the higher the selective pressure will be [20].
Example 3.5. In this selection scheme, assume that we set η⁺ = 1.8 and η⁻ = 2.0 − 1.8 = 0.2. Table 5 shows the selection probability of the HM solutions. The solution with the highest rank (i.e., the minimum objective value in our example) has the highest probability of being selected (see Fig. 3). Notice that the probability of selecting the best solution in HM is 36%, whereas in the proportional selection example it was about 60%.
3.6. Exponential ranking selection scheme

The exponential ranking selection scheme ranks the solutions in the HM based on the parameter s. The best harmony in the HM takes the value c_1 = 1; the second best takes the value c_2 = s, where s is normally initialized to s = 0.99; the third best takes the value c_3 = s², and so on, with the worst harmony taking the value c_HMS = s^(HMS−1) [26]. The exponential ranking selection scheme works as follows:
Table 8
Average and standard deviation (±SD) of the benchmark function results (N = 30).

Function | RHS | GHS | PHS | THS | LHS | EHS
Sphere | 0.000160 (0.000040) | 0.006461 (0.030978) | 0.000015 (0.000020) | 0.000142 (0.000029) | 0.000091 (0.000024) | 0.000019 (0.000008)
Schwefel's problem 2.22 | 0.034538 (0.003866) | 0.045706 (0.006015) | 0.002006 (0.001262) | 0.037457 (0.004649) | 0.031569 (0.006278) | 0.007842 (0.002587)
Step | 0.066667 (0.253708) | 0.500000 (0.973795) | 0 (0) | 0.100000 (0.305129) | 0 (0) | 0 (0)
Rosenbrock | 117.843170 (76.628913) | 124.025995 (107.399321) | 30.026920 (7.021281) | 83.690387 (52.833365) | 46.766364 (32.385750) | 28.327790 (0.151823)
Rotated hyper-ellipsoid | 1.699282 (2.357059) | 2.292142 (2.944390) | 0.000312 (0.000345) | 0.657533 (0.707746) | 0.096644 (0.092144) | 0.002056 (0.000959)
Schwefel's problem 2.26 | −12565.077806 (3.005930) | −12558.505089 (4.829678) | −2203.020353 (1975.703106) | −12566.355475 (1.399210) | −12566.276635 (1.514636) | −9765.646522 (400.078376)
Rastrigin | 0.023025 (0.008560) | 0.196085 (0.374963) | 0.001285 (0.003057) | 0.055979 (0.181802) | 0.016256 (0.004429) | 0.004098 (0.001511)
Ackley | 0.016520 (0.043104) | 0.413872 (0.565199) | 0.001441 (0.001359) | 0.008976 (0.001106) | 0.007014 (0.000901) | 0.003184 (0.000896)
Griewank | 1.013306 (0.005998) | 1.029366 (0.013548) | 1.0 (0) | 1.010203 (0.004265) | 1.000327 (0.001182) | 1.0 (0)
Camel-Back | −1.031628 (0) | −1.031628 (0) | −1.030303 (0.003514) | −1.031628 (0) | −1.031628 (0) | −1.031628 (0)
Shifted Sphere | −449.993429 (0.028288) | −449.967505 (0.154257) | −44.053145 (297.878663) | −449.999864 (0.000032) | −449.999853 (0.000035) | 5801.689232 (1761.064344)
Shifted Schwefel's problem 1.2 | −448.576769 (2.095817) | −445.623957 (4.688459) | −7.587749 (363.076144) | −449.360161 (0.487774) | −449.398467 (0.780534) | 962643.85 (337586.125628)
Shifted Rosenbrock | 708.424874 (509.815690) | 669.236922 (268.145930) | 7046753224.53 (5901733106.68) | 524.65 (69.68) | 1084.67 (1555.89) | 538096756.73 (365049813.89)
Shifted Rastrigin | −329.978672 (0.007523) | −329.794801 (0.385365) | −19.432458 (219.905715) | −329.944092 (0.181181) | −329.944487 (0.183369) | −220.091523 (13.008728)
M.A. Al-Betar et al. / Applied Mathematics and Computation 218 (2012) 6095–6117
Fig. 5. The benchmark function landscapes with N = 2 (2 dimensions) [16].
(i) Selection probability: the probability p_i of each solution in HM is calculated as follows:

    p_i = c_i / Σ_{m=1}^{HMS} c_m.
(ii) Sampling method: the value of the decision variable x′_i is sampled from solution k in the HM, such that x′_i = x_i^k,
where k is chosen using the roulette wheel method shown in Algorithm 2.
It is worth mentioning that the value of s determines the selective pressure, where a smaller value of s leads to a higher
selective pressure toward the better solutions in HM. Similar to linear ranking, exponential ranking is a static selection scheme where p_i, ∀i ∈ {1, 2, ..., HMS}, is determined in advance and remains constant during the improvisation
process.
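Steps (i) and (ii) can be sketched as follows; the function names are illustrative, and the roulette-wheel sampler is a generic stand-in for Algorithm 2 rather than the authors' exact code:

```python
import random

def exponential_rank_probs(hms, s=0.99):
    """Step (i): weight c_i = s**(i - 1) for rank i (rank 1 = best harmony),
    then normalize so the selection probabilities sum to one."""
    weights = [s ** (i - 1) for i in range(1, hms + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def roulette_wheel(probs):
    """Step (ii): return an index k with probability probs[k]
    (a generic stand-in for Algorithm 2)."""
    r = random.random()
    cumulative = 0.0
    for k, p in enumerate(probs):
        cumulative += p
        if r <= cumulative:
            return k
    return len(probs) - 1  # guard against floating-point round-off

# With s = 0.9 as in Example 3.6, each rank is 0.9 times as likely
# to be chosen as the rank just above it.
probs = exponential_rank_probs(hms=5, s=0.9)
```

Because the probabilities depend only on rank and on the fixed parameter s, they can be computed once before the improvisation loop starts, which is exactly what makes the scheme static.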
Example 3.6. In this selection scheme, assume that we set s = 0.9. Table 6 shows the selection probability of each HM
solution. The solution with the highest rank (i.e., the minimum one in our example) has the highest probability of being selected
(see Fig. 4). Notice that the probabilities of the subsequent ranks decrease exponentially.
4. Experimental results
In this section, the novel selection schemes are experimentally evaluated. Six variations of the HS algorithm are proposed here, each of which uses a particular selection scheme incorporated into memory
consideration:
1. Random Harmony Search (RHS): it uses the memory consideration with the random selection scheme as presented in the
basic HS algorithm provided by Geem et al. [1].
2. Global best Harmony Search (GHS): it uses the memory consideration with the global best selection scheme.
3. Proportional Harmony Search (PHS): it uses the memory consideration with the proportional selection scheme.
4. Tournament Harmony Search (THS): it uses the memory consideration with the tournament selection scheme.
5. Linear rank Harmony Search (LHS): it uses the memory consideration with the linear rank selection scheme.
6. Exponential rank Harmony Search (EHS): it uses the memory consideration with the exponential rank selection
scheme.
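All six variants share the same improvisation loop and differ only in the selection scheme that memory consideration calls. A minimal sketch, assuming a uniform pitch adjustment of width FW; the helper functions and parameter defaults (taken from the settings quoted later in this section) are illustrative, not the authors' exact implementation:

```python
import random

def improvise(hm, fitness, select, hmcr=0.94, par=0.3, fw=0.01,
              bounds=(-100.0, 100.0)):
    """Generate one new harmony from the harmony memory `hm`.

    `select(hm, fitness)` returns the index of the HM solution that memory
    consideration samples from; plugging in a different selection scheme
    yields RHS, GHS, PHS, THS, LHS, or EHS.
    """
    new_harmony = []
    for i in range(len(hm[0])):
        if random.random() < hmcr:               # memory consideration
            k = select(hm, fitness)
            x = hm[k][i]
            if random.random() < par:            # pitch adjustment
                x += random.uniform(-1.0, 1.0) * fw
        else:                                    # random consideration
            x = random.uniform(*bounds)
        new_harmony.append(x)
    return new_harmony

# Two of the six schemes (minimization, so lower fitness is better):
def random_select(hm, fitness):                  # RHS
    return random.randrange(len(hm))

def global_best_select(hm, fitness):             # GHS
    return min(range(len(hm)), key=lambda k: fitness[k])
```

The remaining four variants would replace `select` with the proportional, tournament, linear rank, or exponential rank samplers described in Section 3.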
Table 9
Average and standard deviation (±SD) of the benchmark function optimization results (N = 100). Each cell is average (±SD).

Function | RHS | GHS | PHS | THS | LHS | EHS
Sphere | 4892.013579 (700.314356) | 5393.138129 (586.390725) | 119.370522 (653.816515) | 4380.318009 (552.126809) | 852.302703 (352.041369) | 0.001731 (0.000161)
Schwefel's problem 2.22 | 42.459929 (2.879765) | 44.695749 (2.871720) | 0.086208 (0.007149) | 39.629331 (2.764928) | 6.683879 (3.580716) | 0.236241 (0.027654)
Step | 4425.333333 (852.587030) | 5043.833333 (598.187266) | 0 (0) | 3948.2 (510.755316) | 1018.6 (545.990943) | 0 (0)
Rosenbrock | 969493.55 (218828.81) | 1027153.59 (224211.79) | 98.936182 (0.029634) | 668189.580 (173673.80) | 277098.42 (145290.05) | 98.91 (0.048918)
Rotated hyper-ellipsoid | 6862896.283514 (850847.47) | 6802092.41 (705232.53) | 1.125397 (0.170532) | 5860588.30 (708999.73) | 1755382.89 (802470.66) | 4.314254 (0.796516)
Schwefel's problem 2.26 | 37588.628251 (440.572658) | 36470.868644 (465.672247) | 2908.950239 (691.709848) | 37381.761069 (350.186404) | 37414.901804 (527.578331) | 15739.064 (715.492858)
Rastrigin | 182.497453 (14.287768) | 197.286827 (13.082067) | 0.097375 (0.010190) | 176.965641 (9.493890) | 24.069768 (11.861607) | 0.324566 (0.042217)
Ackley | 8.895488 (0.473758) | 9.150285 (0.269468) | 0.009122 (0.000590) | 8.490273 (0.297285) | 4.254100 (0.974013) | 0.017168 (0.001176)
Griewank | 45.620787 (5.844016) | 48.904743 (5.496883) | 2.619533 (8.870547) | 40.452075 (4.330871) | 10.206330 (5.138715) | 1.0 (0)
Camel-Back | 1.031628 (0) | 1.031628 (0) | 1.030553 (0.002476) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0)
Shifted Sphere | 5158.825040 (795.410406) | 5613.799773 (719.834724) | 278407.341820 (74663.594770) | 4880.198954 (749.542902) | 4669.191452 (664.643158) | 126659.790727 (16588.027310)
Shifted Schwefel's problem 1.2 | 7833949.65 (1020500.74) | 8125598.15 (835206.7) | 821885275.41 (270240857.76) | 7388540.03 (1294747.2) | 6837621.25 (1123004.54) | 300276334.35 (47957703.17)
Shifted Rosenbrock | 116417446.08 (0605134.25) | 85960756.91 (26417592.27) | 70592340595.53 (25969177436.57) | 100453092.05 (29267458.89) | 97356117.09 (28689373.95) | 48927513458.55 (9851096031.24)
Shifted Rastrigin | 126.924707 (17.579457) | 99.553717 (18.839925) | 5.325311 (66.060053) | 126.992802 (14.044825) | 135.840579 (17.863249) | 707.733716 (59.057874)
Table 10
Effect of HMS (f1, ..., f5). Each cell is average (±SD).

Sphere (HMS →) | 5 | 10 | 20 | 50
RHS | 0.000160 (0.000040) | 0.036357 (0.072431) | 0.642108 (0.445383) | 2.855776 (1.088880)
GHS | 0.006461 (0.030978) | 0.002766 (0.006853) | 0.000403 (0.000941) | 0.000269 (0.000110)
PHS | 0.000015 (0.000020) | 0.006609 (0.036125) | 0.000004 (0.000003) | 0.000004 (0.000004)
THS | 0.000142 (0.000029) | 0.000179 (0.000123) | 0.000191 (0.000119) | 0.000186 (0.000050)
LHS | 0.000091 (0.000024) | 0.000096 (0.000023) | 0.000130 (0.000050) | 0.000098 (0.000024)
EHS | 0.000019 (0.000008) | 0.000033 (0.000010) | 0.000050 (0.000015) | 0.000019 (0.000008)

Schwefel's problem 2.22 (HMS →) | 5 | 10 | 20 | 50
RHS | 0.034538 (0.003866) | 0.032875 (0.005349) | 0.038883 (0.004833) | 0.053601 (0.012880)
GHS | 0.045706 (0.006015) | 0.050841 (0.016817) | 0.051191 (0.010538) | 0.048181 (0.008790)
PHS | 0.002006 (0.001262) | 0.001703 (0.001173) | 0.001863 (0.001265) | 0.001692 (0.001161)
THS | 0.037457 (0.004649) | 0.034668 (0.003467) | 0.033580 (0.004348) | 0.036165 (0.004085)
LHS | 0.031569 (0.006278) | 0.030632 (0.005202) | 0.030805 (0.009053) | 0.031862 (0.005560)
EHS | 0.007842 (0.002587) | 0.010056 (0.002691) | 0.015005 (0.003571) | 0.007179 (0.002444)

Step (HMS →) | 5 | 10 | 20 | 50
RHS | 0.066667 (0.253708) | 0.100000 (0.305129) | 0.166667 (0.379049) | 0.433333 (0.773854)
GHS | 0.500000 (0.973795) | 1.100000 (1.561388) | 1.166667 (1.440386) | 1.500000 (1.502871)
PHS | 0 (0) | 0 (0) | 0 (0) | 0 (0)
THS | 0.100000 (0.305129) | 0.166667 (0.379049) | 0.200000 (0.484234) | 0.266667 (0.449776)
LHS | 0 (0) | 0 (0) | 0.033333 (0.182574) | 0 (0)
EHS | 0 (0) | 0 (0) | 0 (0) | 0 (0)

Rosenbrock (HMS →) | 5 | 10 | 20 | 50
RHS | 117.843170 (76.628913) | 220.475942 (364.969628) | 255.064299 (452.166133) | 243.081964 (187.608657)
GHS | 124.025995 (107.399321) | 93.678236 (57.248350) | 128.756664 (95.996117) | 152.331539 (155.738582)
PHS | 30.026920 (7.021281) | 32.553870 (20.868209) | 32.770401 (21.937167) | 37.771613 (49.056014)
THS | 83.690387 (52.833365) | 90.586506 (67.123934) | 109.353441 (99.081458) | 137.846810 (166.294679)
LHS | 46.766364 (32.385750) | 54.232748 (56.258399) | 60.211818 (97.682424) | 67.200212 (114.059247)
EHS | 28.327790 (0.151823) | 28.121431 (0.144448) | 28.172755 (0.138265) | 28.334054 (0.177648)

Rotated hyper-ellipsoid (HMS →) | 5 | 10 | 20 | 50
RHS | 1.699282 (2.357059) | 22.566722 (44.672302) | 115.826804 (87.433481) | 689.390995 (331.129509)
GHS | 2.292142 (2.944390) | 2.302721 (2.143413) | 2.623966 (2.807320) | 1.998528 (2.310558)
PHS | 0.000312 (0.000345) | 0.179117 (0.979688) | 0.000162 (0.000165) | 0.000224 (0.000245)
THS | 0.657533 (0.707746) | 2.181854 (2.558510) | 1.040252 (1.182715) | 1.307339 (1.305965)
LHS | 0.096644 (0.092144) | 0.185594 (0.273466) | 0.265863 (0.417295) | 0.123458919 (0.179185944)
EHS | 0.002056 (0.000959) | 0.004724 (0.001642) | 0.006872 (0.002339) | 0.002215 (0.000782)
All HS variations use the same parameter settings: HMS = 5, HMCR = 0.94, PAR = 0.3, FW = 0.01, and NI = 50,000.
These values are similar to those suggested in state-of-the-art methods [18,19,16,33]. For THS, the
tournament size is randomly chosen in each iteration from the range k ∈ [1, HMS] [31]. For LHS, η+ = 1.1, as
recommended in [32,31]. Furthermore, for EHS, s = 0.99, which is conventionally used in the literature [26]. Table 7
Table 11
Effect of HMS (f6, ..., f10). Each cell is average (±SD).

Schwefel's problem 2.26 (HMS →) | 5 | 10 | 20 | 50
RHS | 12565.077806 (3.005930) | 12564.265655 (2.213120) | 12561.564693 (3.374582) | 12558.709396 (4.831471)
GHS | 12558.505089 (4.829678) | 12559.441240 (4.556557) | 12559.502545 (3.530191) | 12557.482791 (5.169041)
PHS | 2203.020353 (1975.703106) | 2426.291663 (1959.720136) | 2616.553868 (1947.811844) | 2771.479103 (1888.479503)
THS | 12566.355475 (1.399210) | 12565.953538 (2.043225) | 12567.638605 (1.057520) | 12565.161477 (2.549264)
LHS | 12566.276635 (1.514636) | 12565.143803 (1.578924) | 12564.520220 (1.667966) | 12566.312429 (2.042455)
EHS | 9765.646522 (400.078376) | 11120.686040 (257.670599) | 11755.926553 (192.684783) | 9665.536039 (403.785597)

Rastrigin (HMS →) | 5 | 10 | 20 | 50
RHS | 0.023025 (0.008560) | 0.057812 (0.180192) | 0.028560 (0.006662) | 0.074707 (0.180819)
GHS | 0.196085 (0.374963) | 0.197699 (0.457671) | 0.196527 (0.456079) | 0.134627 (0.300670)
PHS | 0.001285 (0.003057) | 0.001946 (0.005292) | 0.001547 (0.004451) | 0.004762 (0.021636)
THS | 0.055979 (0.181802) | 0.023606 (0.005902) | 0.023959 (0.009061) | 0.025075 (0.007525)
LHS | 0.016256 (0.004429) | 0.017953 (0.004397) | 0.018674 (0.006980) | 0.016375 (0.003051)
EHS | 0.004098 (0.001511) | 0.006924 (0.001839) | 0.009927 (0.002500) | 0.003618 (0.001566)

Ackley (HMS →) | 5 | 10 | 20 | 50
RHS | 0.016520 (0.043104) | 0.009195 (0.001052) | 0.060283 (0.121080) | 0.255074 (0.213804)
GHS | 0.413872 (0.565199) | 0.401566 (0.535652) | 0.419704 (0.529091) | 0.288598 (0.4654)
PHS | 0.001441 (0.001359) | 0.001576 (0.001523) | 0.001887 (0.001787) | 0.022335 (0.116335)
THS | 0.008976 (0.001106) | 0.011610 (0.011013) | 0.009720 (0.002356) | 0.009594 (0.001417)
LHS | 0.007014 (0.000901) | 0.006977 (0.000869) | 0.007343 (0.000889) | 0.007248 (0.000630)
EHS | 0.003184 (0.000896) | 0.004356 (0.000542) | 0.005112 (0.000788) | 0.003278 (0.000601)

Griewank (HMS →) | 5 | 10 | 20 | 50
RHS | 1.013306 (0.005998) | 1.016792 (0.005291) | 1.019490 (0.007272) | 1.029672 (0.010497)
GHS | 1.029366 (0.013548) | 1.034955 (0.018194) | 1.033735 (0.020249) | 1.032718 (0.014583)
PHS | 1.0 (0) | 1.000285 (0.001562) | 1.000815 (0.004462) | 1.000926 (0.005074)
THS | 1.010203 (0.004265) | 1.016277 (0.004926) | 1.012887 (0.005959) | 1.015080 (0.004517)
LHS | 1.000327 (0.001182) | 1.000556 (0.001237) | 1.001526 (0.001891) | 1.000276 (0.000632)
EHS | 1.0 (0) | 1.0 (0) | 1.0 (0) | 1.0 (0)

Camel-Back (HMS →) | 5 | 10 | 20 | 50
RHS | 1.031628 (0) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0)
GHS | 1.031628 (0) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0)
PHS | 1.030303 (0.003514) | 1.029417 (0.005148) | 1.028544 (0.005353) | 1.027044 (0.006968)
THS | 1.031628 (0) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0)
LHS | 1.031628 (0) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0)
EHS | 1.031628 (0) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0)
summarizes the 14 global minimization benchmark functions used to evaluate the HS variations, most of which
were previously used in [18,19,16,33]. These benchmark functions provide a trade-off between unimodal and multimodal
functions. The benchmark functions were implemented with N = 30, with the exception of the Six-Hump Camel-Back function, which is two-dimensional.
Table 12
Effect of HMS (f11, ..., f14). Each cell is average (±SD).

Shifted Sphere (HMS →) | 5 | 10 | 20 | 50
RHS | 449.993429 (0.028288) | 449.952173 (0.066867) | 449.080676 (0.715846) | 447.346657 (0.997764)
GHS | 449.967505 (0.154257) | 449.997914 (0.006297) | 449.975845 (0.117046) | 449.998649 (0.004441)
PHS | 44.053145 (297.878663) | 61.992347 (351.697802) | 65.941970 (325.644115) | 88.298670 (356.051269)
THS | 449.999864 (0.000032) | 449.999836 (0.000047) | 449.999677 (0.000943) | 449.999736 (0.000138)
LHS | 449.999853 (0.000035) | 449.999558 (0.001111) | 449.803558 (0.292858) | 449.999855 (0.000026)
EHS | 5801.689232 (1761.064344) | 1564.074747 (564.247272) | 563.030971 (333.673069) | 6782.140049 (2246.940511)

Shifted Schwefel's problem 1.2 (HMS →) | 5 | 10 | 20 | 50
RHS | 448.576769 (2.095817) | 440.575037 (12.064484) | 342.039896 (83.134743) | 133.717632 (264.256639)
GHS | 445.623957 (4.688459) | 446.900812 (2.371801) | 447.007381 (2.969228) | 445.681188 (3.581369)
PHS | 7.587749 (363.076144) | 48.748413 (366.028440) | 30.589766 (250.455121) | 12549052.147661 (7460382.813876)
THS | 449.360161 (0.487774) | 449.306417 (0.620477) | 448.947347 (1.054469) | 448.456252 (1.718936)
LHS | 449.398467 (0.780534) | 446.233794 (10.729268) | 430.097120 (19.033536) | 449.199970 (0.896531)
EHS | 962643.859146 (337586.125628) | 238586.347616 (85736.815683) | 109703.244954 (28581.765755) | 797791.908693 (261548.135419)

Shifted Rosenbrock (HMS →) | 5 | 10 | 20 | 50
RHS | 708.424874 (509.815690) | 773.660439 (410.812620) | 1068.621323 (441.752368) | 1688.826050 (765.688456)
GHS | 669.236922 (268.145930) | 639.603577 (219.631609) | 633.309018 (217.061108) | 736.164160 (502.833419)
PHS | 7046753224.531870 (5901733106.686030) | 6889896637.119450 (5865437364.049110) | 7103446635.977660 (5889942140.050610) | 8340784887.250140 (6283241721.998830)
THS | 524.653409 (69.681007) | 612.299599 (311.906230) | 628.466939 (346.863811) | 732.084477 (479.936003)
LHS | 1084.673868 (1555.891786) | 621.604601 (241.341910) | 1594.394129 (2720.440038) | 1402.137436 (2895.136300)
EHS | 538096756.733454 (365049813.891163) | 34547026.133884 (27745770.571293) | 13325923.630313 (8825456.901390) | 493718995.794013 (307787653.539773)

Shifted Rastrigin (HMS →) | 5 | 10 | 20 | 50
RHS | 329.978672 (0.007523) | 329.978787 (0.003549) | 329.939431 (0.180093) | 329.928320 (0.185015)
GHS | 329.794801 (0.385365) | 329.760177 (0.480308) | 329.791225 (0.375381) | 329.825402 (0.343330)
PHS | 19.432458 (219.905715) | 19.810110 (220.432428) | 26.033217 (228.850667) | 43.677657 (217.643887)
THS | 329.944092 (0.181181) | 329.976592 (0.007343) | 329.977373 (0.006508) | 329.976892 (0.005781)
LHS | 329.944487 (0.183369) | 329.877843 (0.301960) | 329.909797 (0.251353) | 329.945351 (0.183094)
EHS | 220.091523 (13.008728) | 271.293158 (11.461811) | 292.070951 (5.894586) | 220.502055 (19.241208)
For the experimental evaluations, extensive experiments were executed on an Intel 2 GHz Core 2 Quad processor with 4 GB of
RAM. The proposed HS variations were programmed in Visual Basic and implemented as a VBA macro
in Microsoft Excel under Windows XP.
4.1. Comparison of HS variations
Table 8 summarizes the results of the HS variations on the 14 benchmark functions. Each HS variation was run for 30 independent simulations, and the numbers in the table refer to the averages and standard deviations (±SD) over those runs. The
best results (lowest is best) were highlighted in bold.
Apparently, PHS achieves the best results for five functions: Sphere, Schwefel's problem 2.22, Rotated hyper-ellipsoid, Rastrigin, and Ackley. Furthermore, it achieves the best result for the Step function, as do LHS and EHS, both of which
achieve results close to those of PHS. The results also demonstrate that PHS and EHS obtain the global optimum
for Griewank. Additionally, EHS achieves the best result for Rosenbrock.
Table 13
Effect of HMCR (f1, ..., f5). Each cell is average (±SD).

Sphere (HMCR →) | 0.5 | 0.7 | 0.9 | 0.94 | 0.99
RHS | 13385.691929 (2146.077956) | 3632.597243 (668.276661) | 7.177465 (3.252494) | 0.000160 (0.000040) | 0.000054 (0.000011)
GHS | 14406.529011 (2216.422013) | 4182.798038 (674.562269) | 10.659350 (4.028915) | 0.006461 (0.030978) | 0.000079 (0.000014)
PHS | 3833.355787 (1536.059589) | 38.700047 (57.834317) | 0.331216 (1.814097) | 0.000015 (0.000020) | 0.000001 (0.000001)
THS | 14160.709048 (2065.217690) | 3657.978865 (704.222747) | 4.677996 (1.686677) | 0.000142 (0.000029) | 0.000054 (0.000014)
LHS | 13650.201071 (2166.098253) | 3617.508857 (604.039061) | 6.518513 (1.840438) | 0.000174 (0.000089) | 0.000053 (0.000011)
LHS | 13379.312301 (2501.620413) | 3286.705496 (669.879655) | 0.908095 (1.329842) | 0.000091 (0.000024) | 0.000038 (0.000007)
EHS | 8311.927426 (1356.416961) | 481.759146 (285.499584) | 0.000045 (0.000011) | 0.000019 (0.000008) | 0.000006 (0.000004)

Schwefel's problem 2.22 (HMCR →) | 0.5 | 0.7 | 0.9 | 0.94 | 0.99
RHS | 41.639832 (3.203389) | 17.199224 (1.488143) | 0.100211 (0.053129) | 0.034538 (0.003866) | 0.021108 (0.003145)
GHS | 45.761401 (4.152859) | 24.150648 (2.362887) | 0.197292 (0.087142) | 0.045706 (0.006015) | 0.031184 (0.004261)
PHS | 10.878600 (3.049702) | 0.354016 (0.404194) | 0.002824 (0.001950) | 0.002006 (0.001262) | 0.000298 (0.000480)
THS | 43.119330 (3.506341) | 18.729076 (2.014781) | 0.092001 (0.040555) | 0.037457 (0.004649) | 0.023720 (0.002626)
LHS | 42.298595 (3.867776) | 14.998578 (3.366113) | 0.070642 (0.027949) | 0.031569 (0.006278) | 0.016705 (0.003584)
EHS | 24.558445 (3.597564) | 2.525801 (1.025773) | 0.012099 (0.003456) | 0.007842 (0.002587) | 0.002746 (0.001645)

Step (HMCR →) | 0.5 | 0.7 | 0.9 | 0.94 | 0.99
RHS | 13576.1 (1916.775244) | 2969.766667 (567.516470) | 3.7 (2.614944) | 0.066667 (0.253708) | 0.866667 (1.569831)
GHS | 13539.233333 (2108.729421) | 3868.5 (809.058062) | 5.866667 (4.882999) | 0.5 (0.973795) | 3.8 (1.954658)
PHS | 4306.7 (1542.477768) | 24.966667 (48.713436) | 0.066667 (0.365148) | 0 (0) | 0 (0)
THS | 13920.466667 (1900.673197) | 3266.333333 (626.639771) | 2.933333 (2.585548) | 0.1 (0.305129) | 0.466667 (0.681445)
LHS | 13016.466667 (1806.008568) | 2776.633333 (587.233930) | 1.0 (1.339068) | 0 (0) | 0 (0)
EHS | 8440.166667 (1815.289434) | 439.066667 (257.966575) | 0 (0) | 0 (0) | 0 (0)

Rosenbrock (HMCR →) | 0.5 | 0.7 | 0.9 | 0.94 | 0.99
RHS | 16106288.879178 (4092308.388348) | 1166619.941830 (432261.233485) | 234.292452 (118.344928) | 117.843170 (76.628913) | 112.093375 (93.487657)
GHS | 15550378.899778 (4054232.307771) | 1207334.453418 (459770.971943) | 257.577912 (139.254415) | 124.025995 (107.399321) | 129.990442 (154.420529)
PHS | 3045216.858275 (2249315.226145) | 6491.623239 (11679.465486) | 34.869515 (32.972775) | 30.026920 (7.021281) | 28.705396 (0.106917)
THS | 16901017.570312 (4104059.944059) | 1122131.200951 (485067.318647) | 259.304117 (279.312686) | 83.690387 (52.833365) | 94.359608 (95.014819)
LHS | 14493326.330574 (2879955.062135) | 1068623.563909 (432719.031885) | 157.535061 (86.767240) | 46.766364 (32.385750) | 21.121689 (0.307178)
EHS | 8925029.067344 (3175529.252219) | 99295.792411 (76259.334135) | 28.605786 (0.142966) | 28.327790 (0.151823) | 27.735304 (0.158584)

Rotated hyper-ellipsoid (HMCR →) | 0.5 | 0.7 | 0.9 | 0.94 | 0.99
RHS | 2071321.818398 (331568.210759) | 452332.646285 (93732.407759) | 1395.288265 (838.374323) | 1.699282 (2.357059) | 0.127790 (0.161650)
GHS | 2712378.998175 (404401.948675) | 549031.328285 (98172.563452) | 2019.995553 (692.761540) | 2.292142 (2.944390) | 0.133346 (0.118304)
PHS | 502872.490024 (199149.350594) | 2379.272682 (4782.371649) | 22.577391 (123.658390) | 0.000312 (0.000345) | 0.000033 (0.000083)
THS | 2259662.340477 (334501.239205) | 441794.649055 (84914.608777) | 805.206422 (567.015006) | 0.657533 (0.707746) | 0.066067 (0.056603)
LHS | 2243477.592655 (427571.054013) | 387720.642166 (118582.390240) | 228.677901 (556.331634) | 0.096644 (0.092144) | 0.006888 (0.003716)
EHS | 1297959.123786 (214244.549103) | 44983.227650 (21783.9193) | 0.006386 (0.002703) | 0.002056 (0.000959) | 0.000515 (0.000333)
For Schwefel's problem 2.26, whose landscape houses a considerable number of locally optimal solutions (see Fig. 5(f)),
THS yields the best results. For the problem with few dimensions (i.e., the Six-Hump Camel-Back function), all HS variations
achieved the global optimum, except for PHS, which does not seem to perform well on problems with negative objective
Table 14
Effect of HMCR (f6, ..., f10). Each cell is average (±SD).

Schwefel's problem 2.26 (HMCR →) | 0.5 | 0.7 | 0.9 | 0.94 | 0.99
RHS | 8303.350550 (360.847028) | 10776.905643 (332.829573) | 12540.393465 (11.477858) | 12565.077806 (3.005930) | 12569.380545 (0.397496)
GHS | 7466.540469 (289.143266) | 9913.277768 (277.334706) | 12527.479645 (14.907758) | 12558.505089 (4.829678) | 12558.592263 (9.533928)
PHS | 3524.055289 (343.364091) | 3051.180683 (373.145741) | 2440.739299 (1928.133062) | 2203.020353 (1975.703106) | 1771.722956 (682.868236)
THS | 8216.976080 (303.626384) | 10678.062535 (316.933667) | 12547.680559 (7.649327) | 12566.355475 (1.399210) | 12569.486607 (0.000003)
LHS | 8136.751080 (334.836210) | 10674.482171 (290.202535) | 12542.602541 (9.706397) | 12566.276635 (1.514636) | 12569.486603 (0.000017)
EHS | 6553.247474 (373.420156) | 7990.475154 (340.340718) | 9663.167388 (333.880749) | 9765.646522 (400.078376) | 8175.786826 (586.874435)

Rastrigin (HMCR →) | 0.5 | 0.7 | 0.9 | 0.94 | 0.99
RHS | 172.210442 (13.989671) | 77.149308 (10.148819) | 0.734054 (1.018480) | 0.023025 (0.008560) | 0.283342 (0.446301)
GHS | 193.098615 (11.154884) | 97.369166 (12.130768) | 1.373929 (0.772191) | 0.196085 (0.374963) | 0.643486 (0.776714)
PHS | 54.261724 (11.575599) | 1.643696 (2.068213) | 0.069624 (0.374484) | 0.001285 (0.003057) | 0.000056 (0.000114)
THS | 177.204324 (17.950621) | 83.218505 (9.112650) | 0.732212 (0.698371) | 0.055979 (0.181802) | 0.181564 (0.375826)
LHS | 167.222467 (17.539388) | 53.742464 (15.417749) | 0.138169 (0.302808) | 0.016256 (0.004429) | 0.007314 (0.001892)
EHS | 95.875402 (12.001916) | 9.062587 (4.993626) | 0.009813 (0.002806) | 0.004098 (0.001511) | 0.001002 (0.000603)

Ackley (HMCR →) | 0.5 | 0.7 | 0.9 | 0.94 | 0.99
RHS | 16.396270 (0.579309) | 11.360224 (0.693393) | 1.130017 (0.378863) | 0.016520 (0.043104) | 0.124098 (0.286872)
GHS | 16.698272 (0.603631) | 12.322284 (0.713047) | 1.250415 (0.409927) | 0.413872 (0.565199) | 1.152401 (0.552550)
PHS | 10.691385 (1.356335) | 0.695067 (0.839222) | 0.036798 (0.191172) | 0.001441 (0.001359) | 0.000182 (0.000303)
THS | 16.693556 (0.352639) | 11.724200 (0.662869) | 0.696607 (0.581026) | 0.008976 (0.001106) | 0.222890 (0.397543)
LHS | 16.429132 (0.457300) | 10.624278 (1.096440) | 0.099773 (0.224880) | 0.007014 (0.000901) | 0.004657 (0.000737)
EHS | 14.213152 (0.926789) | 4.655906 (1.083871) | 0.005060 (0.000719) | 0.003184 (0.000896) | 0.001587 (0.000584)

Griewank (HMCR →) | 0.5 | 0.7 | 0.9 | 0.94 | 0.99
RHS | 123.797913 (17.627900) | 31.966569 (5.869759) | 1.103008 (0.041158) | 1.013306 (0.005998) | 1.001733 (0.002652)
GHS | 132.203527 (18.559296) | 38.566093 (7.187923) | 1.138027 (0.051657) | 1.029366 (0.013548) | 1.023487 (0.012445)
PHS | 35.180631 (13.681459) | 1.229053 (0.455415) | 1.002088 (0.011434) | 1.0 (0) | 1.0 (0)
THS | 133.562186 (18.989586) | 32.340364 (6.235961) | 1.071751 (0.032548) | 1.010203 (0.004265) | 1.000130 (0.000516)
LHS | 126.655631 (22.322125) | 30.415470 (6.996733) | 1.022861 (0.021871) | 1.000327 (0.001182) | 1.0 (0)
EHS | 79.922554 (16.958112) | 5.659690 (3.604465) | 1.0 (0) | 1.0 (0) | 1.0 (0)

Camel-Back (HMCR →) | 0.5 | 0.7 | 0.9 | 0.94 | 0.99
RHS | 1.031628 (0) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0)
GHS | 1.031628 (0) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0)
PHS | 1.031363 (0.000552) | 1.031433 (0.000562) | 1.031331 (0.001080) | 1.030303 (0.003514) | 1.028199 (0.004752)
THS | 1.031628 (0) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0)
LHS | 1.031628 (0) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0)
EHS | 1.031628 (0) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0) | 1.031628 (0)
function values. THS achieves the best results for Shifted Sphere and Shifted Rosenbrock, while LHS does so for Shifted Schwefel's
problem 1.2 and Shifted Rastrigin.
It can be noticed that PHS fails to converge to the minima of Schwefel's problem 2.26 and the shifted benchmark
functions (i.e., benchmarks with negative optimum values). In general, the HS variations that rely on the survival of the
Table 15
Effect of HMCR (f11, ..., f14). Each cell is average (±SD).

Shifted Sphere (HMCR →) | 0.5 | 0.7 | 0.9 | 0.94 | 0.99
RHS | 19630.795393 (3487.911419) | 4099.899148 (943.666535) | 440.801139 (4.437129) | 449.993429 (0.028288) | 449.999944 (0.000012)
GHS | 23072.571493 (3771.602898) | 4587.016496 (887.356027) | 435.857873 (6.823155) | 449.967505 (0.154257) | 449.999922 (0.000017)
PHS | 57650.551553 (10543.993398) | 55295.491238 (16612.828395) | 51.223005 (304.903698) | 44.053145 (297.878663) | 37.881562 (327.639583)
THS | 18594.695114 (3301.196427) | 4190.842656 (976.198717) | 444.047616 (3.277870) | 449.999864 (0.000032) | 449.999948 (0.000012)
LHS | 17920.970721 (2805.136835) | 3767.407463 (1060.013986) | 444.003131 (2.571498) | 449.999853 (0.000035) | 449.999948 (0.000010)
EHS | 28287.141192 (5347.510754) | 15849.388852 (3083.085058) | 6875.280860 (2085.974382) | 5801.689232 (1761.064344) | 9648.902039 (3551.016093)

Shifted Schwefel's problem 1.2 (HMCR →) | 0.5 | 0.7 | 0.9 | 0.94 | 0.99
RHS | 2885066.800695 (547757.318472) | 541367.085876 (144130.896074) | 923.445319 (779.689891) | 448.576769 (2.095817) | 449.909240 (0.071694)
GHS | 3474800.177185 (502236.100967) | 665947.119087 (169487.312245) | 818.875706 (714.870085) | 445.623957 (4.688459) | 449.164700 (0.963215)
PHS | 12098172.855862 (2810527.276444) | 12667039.378369 (4342679.961145) | 12519628.321040 (6913043.146503) | 7.587749 (363.076144) | 3.149386 (370.066516)
THS | 2702792.044380 (531774.020856) | 529740.888224 (144290.901977) | 548.508886 (479.351228) | 449.360161 (0.487774) | 449.933552 (0.052206)
LHS | 2778052.714359 (621631.033434) | 534085.025449 (153878.464012) | 546.131889 (558.315875) | 449.398467 (0.780534) | 449.881405 (0.168799)
EHS | 5316697.941202 (1023675.084742) | 2624643.829040 (561317.065535) | 946598.695551 (309138.764095) | 962643.859146 (337586.125628) | 1462195.730501 (614484.813975)

Shifted Rosenbrock (HMCR →) | 0.5 | 0.7 | 0.9 | 0.94 | 0.99
RHS | 2493508055.65 (671812125.10) | 161593258.29 (53005324.12) | 3665.25 (2056.82) | 708.42 (509.81) | 515.19 (105.665219)
GHS | 2704947010.80 (559299220.84) | 134732044.38 (28216680.37) | 3853.28 (2228.08) | 669.23 (268.14) | 589.40 (309.94)
PHS | 11250821202.65 (2929600446.89) | 9925934290.28 (3691278276.48) | 7317582585.23 (5851292954.48) | 7046753224.53 (5901733106.7) | 7989915411.29 (6726120405.2)
THS | 2528546142.86 (749284094.78) | 142655121.30 (60455692.29) | 2530.25 (966.08) | 524.65 (69.68) | 497.01 (106.87)
LHS | 3080870183.75 (1081953171.84) | 201877843.28 (64395609.22) | 4146.85 (2987.26) | 1084.67 (1555.89) | 1106.80 (1876.38)
EHS | 7444237142.56 (2608916736.73) | 2782334691.19 (1290789519.36) | 586791506.70 (448348311.40) | 538096756.73 (365049813.89) | 936285819.81 (859070474.69)

Shifted Rastrigin (HMCR →) | 0.5 | 0.7 | 0.9 | 0.94 | 0.99
RHS | 113.853612 (18.421266) | 237.896037 (14.074702) | 329.235541 (0.848076) | 329.978672 (0.007523) | 329.883883 (0.304559)
GHS | 92.234926 (23.180497) | 209.170882 (12.361670) | 328.662812 (1.026431) | 329.794801 (0.385365) | 329.009433 (0.880254)
PHS | 22.169026 (65.504743) | 22.456136 (149.615234) | 13.372751 (206.828562) | 19.432458 (219.905715) | 9.220330 (205.361164)
THS | 117.373058 (17.101148) | 231.675296 (14.645463) | 329.236875 (0.924232) | 329.944092 (0.181181) | 329.951564 (0.184239)
LHS | 121.295565 (19.722645) | 234.384802 (12.464922) | 329.292073 (0.718992) | 329.944487 (0.183369) | 329.884316 (0.302282)
EHS | 57.986591 (28.420962) | 135.416742 (18.775319) | 213.371518 (14.874533) | 220.091523 (13.008728) | 190.606780 (21.842108)
fittest principle, with increased selective pressure toward the good solutions in HM, achieve better results than RHS. However,
GHS, which focuses on a single best solution during the search, reports the worst results because it quickly reaches a stagnation
state and thus gets trapped in chronic premature convergence. It is worth mentioning that GHS converges quickly
to the optimal solution on problems with few variables, as supported by [36]. On the other hand, for problems with many variables, RHS performs better than GHS, as indicated by Geem [37].
4.2. Scalability study: results for 100-dimensional problems
In this section, the effect of a larger dimensionality (i.e., N = 100) on the performance of the HS variations is investigated using
the same 14 functions and the same parameter values as for N = 30. As in the previous section, the results shown in
Table 9 are summarized in terms of averages and standard deviations over 30 experimental replications. The HS variations achieve almost the same
relative performance for N = 100 as for N = 30. However, the results are not better than
those produced for the 30-dimensional problems; in general, increasing the number of decision variables (or dimensionality) degrades the results produced.
Table 16
Effect of PAR (f1, ..., f5). Each cell is average (±SD).

Sphere (PAR →) | 0.1 | 0.3 | 0.5 | 0.7 | 0.9
RHS | 0.133907 (0.202981) | 0.000160 (0.000040) | 0.000306 (0.000045) | 0.000474 (0.000053) | 1.472575 (0.621702)
GHS | 1.346008 (1.393879) | 0.006461 (0.030978) | 0.000399 (0.000082) | 0.000546 (0.000084) | 0.000710 (0.000127)
PHS | 0 (0) | 0.000015 (0.000020) | 0.000093 (0.000023) | 0.000210 (0.000029) | 0.000348 (0.000038)
THS | 0.017991 (0.029106) | 0.000142 (0.000029) | 0.000281 (0.000046) | 0.000447 (0.000057) | 0.000636 (0.000083)
LHS | 0.000008 (0.000007) | 0.000091 (0.000024) | 0.000225 (0.000040) | 0.000392 (0.000049) | 0.000544 (0.000048)
EHS | 0 (0) | 0.000019 (0.000008) | 0.000141 (0.000027) | 0.000291 (0.000041) | 0.000452 (0.000054)

Schwefel's problem 2.22 (PAR →) | 0.1 | 0.3 | 0.5 | 0.7 | 0.9
RHS | 0.008738 (0.001587) | 0.034538 (0.003866) | 0.063275 (0.005586) | 0.083910 (0.005158) | 0.117016 (0.006585)
GHS | 0.011690 (0.002477) | 0.045706 (0.006015) | 0.080272 (0.020648) | 0.106835 (0.021475) | 0.136174 (0.039449)
PHS | 0 (0) | 0.002006 (0.001262) | 0.018180 (0.002401) | 0.041664 (0.003829) | 0.072625 (0.004060)
THS | 0.008286 (0.000980) | 0.037457 (0.004649) | 0.062532 (0.010118) | 0.085019 (0.008713) | 0.103019 (0.005112)
LHS | 0.004661 (0.002488) | 0.031569 (0.006278) | 0.063910 (0.018937) | 0.083045 (0.011766) | 0.099194 (0.007546)
EHS | 0 (0) | 0.007842 (0.002587) | 0.033930 (0.005303) | 0.064174 (0.005251) | 0.089539 (0.005083)

Step (PAR →) | 0.1 | 0.3 | 0.5 | 0.7 | 0.9
RHS | 0.166667 (0.461133) | 0.066667 (0.253708) | 0.466667 (0.628810) | 0.3 (0.534983) | 0.2 (0.406838)
GHS | 1.233333 (1.330889) | 0.5 (0.973795) | 0.6 (0.563242) | 0.633333 (0.668675) | 1.033333 (0.718395)
PHS | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0)
THS | 0.166667 (0.379049) | 0.1 (0.305129) | 0.1 (0.305129) | 0.2 (0.406838) | 0.033333 (0.182574)
LHS | 0 (0) | 0 (0) | 0.033333 (0.182574) | 0 (0) | 0.033333 (0.182574)
EHS | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0)

Rosenbrock (PAR →) | 0.1 | 0.3 | 0.5 | 0.7 | 0.9
RHS | 113.546312 (105.692823) | 117.843170 (76.628913) | 120.287603 (115.107853) | 109.828521 (114.270002) | 238.145233 (300.350529)
GHS | 152.883849 (151.546040) | 124.025995 (107.399321) | 256.824615 (518.070593) | 82.633893 (41.441610) | 138.212136 (162.133049)
PHS | 28.744643 (0.095267) | 30.026920 (7.021281) | 28.732086 (0.141087) | 28.753828 (0.088401) | 28.822421 (0.075403)
THS | 133.552966 (157.413585) | 83.690387 (52.833365) | 102.286522 (115.792191) | 103.014173 (90.020970) | 104.937745 (197.664022)
LHS | 64.646110 (73.471376) | 46.766364 (32.385750) | 46.988788 (37.444652) | 42.580754 (35.1159) | 52.246348 (77.536016)
EHS | 28.432161 (0.166898) | 28.327790 (0.151823) | 28.296981 (0.171988) | 28.286736 (0.173620) | 28.386760 (0.156310)

Rotated hyper-ellipsoid (PAR →) | 0.1 | 0.3 | 0.5 | 0.7 | 0.9
RHS | 26.280221 (43.215604) | 1.699282 (2.357059) | 0.646677 (0.504016) | 1.062798 (0.909879) | 236.695652 (170.191723)
GHS | 299.091275 (243.322493) | 2.292142 (2.944390) | 1.325384 (0.929971) | 1.143171 (0.718233) | 1.969047 (0.997719)
PHS | 0 (0) | 0.000312 (0.000345) | 0.011611 (0.004470) | 0.036255 (0.008140) | 0.078743 (0.009617)
THS | 3.159110 (3.826037) | 0.657533 (0.707746) | 0.616927 (0.373182) | 0.544081 (0.255314) | 0.737874 (0.297685)
LHS | 0.034586 (0.066560) | 0.096644 (0.092144) | 0.173462 (0.127058) | 0.237000 (0.193782) | 0.300925 (0.148477)
EHS | 0 (0) | 0.002056 (0.000959) | 0.024971 (0.005341) | 0.062492 (0.010963) | 0.103862 (0.014820)
The experimental results lend support to the theoretical role of selective pressure: the results improve once the selective pressure is
emphasized. RHS, which uses the random selection scheme, has the weakest selective pressure and yields almost the worst results.
Table 17
Effect of PAR (f6, ..., f10). Each cell is average (±SD).

Schwefel's problem 2.26 (PAR →) | 0.1 | 0.3 | 0.5 | 0.7 | 0.9
RHS | 12565.233443 (1.349985) | 12565.077806 (3.005930) | 12567.246115 (1.345060) | 12567.364869 (1.049734) | 12559.767407 (4.473321)
GHS | 12555.246417 (4.201373) | 12558.505089 (4.829678) | 12560.741672 (3.758277) | 12564.030872 (3.405302) | 12565.848625 (1.129113)
PHS | 1930.816247 (257.157573) | 2203.020353 (1975.703106) | 1984.641531 (330.429936) | 1991.777837 (336.090707) | 2597.534628 (656.294850)
THS | 12564.609578 (2.817102) | 12566.355475 (1.399210) | 12567.442546 (1.611844) | 12568.172572 (1.381856) | 12568.153924 (0.880607)
LHS | 12564.817663 (2.247382) | 12566.276635 (1.514636) | 12568.044501 (1.153994) | 12568.271565 (0.850622) | 12568.420966 (0.863088)
EHS | 9754.924388 (447.051939) | 9765.646522 (400.078376) | 9782.942586 (399.855744) | 9780.412165 (316.793690) | 9633.318907 (320.991393)

Rastrigin (PAR →) | 0.1 | 0.3 | 0.5 | 0.7 | 0.9
RHS | 0.001548 (0.001087) | 0.023025 (0.008560) | 0.085392 (0.182081) | 0.182526 (0.302866) | 0.191263 (0.181621)
GHS | 0.305786 (0.468188) | 0.196085 (0.374963) | 0.168768 (0.301249) | 0.266863 (0.375605) | 0.463059 (0.489460)
PHS | 0 (0) | 0.001285 (0.003057) | 0.015859 (0.003800) | 0.039874 (0.004374) | 0.071921 (0.008010)
THS | 0.001211 (0.000538) | 0.055979 (0.181802) | 0.056915 (0.014787) | 0.111697 (0.183145) | 0.113154 (0.017680)
LHS | 0.000523 (0.000285) | 0.016256 (0.004429) | 0.045016 (0.007181) | 0.108075 (0.181295) | 0.106054 (0.015541)
EHS | 0 (0) | 0.004098 (0.001511) | 0.028988 (0.005622) | 0.060753 (0.006383) | 0.090489 (0.008980)

Ackley (PAR →) | 0.1 | 0.3 | 0.5 | 0.7 | 0.9
RHS | 0.003624 (0.002018) | 0.016520 (0.043104) | 0.012789 (0.001032) | 0.016900 (0.003452) | 0.143087 (0.219096)
GHS | 0.456250 (0.375673) | 0.413872 (0.565199) | 0.150562 (0.354139) | 0.140894 (0.255022) | 0.241919 (0.412374)
PHS | 0 (0) | 0.001441 (0.001359) | 0.006436 (0.000675) | 0.010872 (0.001339) | 0.014294 (0.000993)
THS | 0.003423 (0.001698) | 0.008976 (0.001106) | 0.012913 (0.001144) | 0.016601 (0.001533) | 0.055304 (0.153902)
LHS | 0.001573 (0.000534) | 0.007014 (0.000901) | 0.011277 (0.000972) | 0.015209 (0.000973) | 0.017591 (0.000871)
EHS | 0 (0) | 0.003184 (0.000896) | 0.008591 (0.001140) | 0.012849 (0.000899) | 0.016310 (0.001023)

Griewank (PAR →) | 0.1 | 0.3 | 0.5 | 0.7 | 0.9
RHS | 1.014512 (0.005651) | 1.013306 (0.005998) | 1.009190 (0.005345) | 1.007701 (0.00451) | 1.028939 (0.013368)
GHS | 1.039656 (0.019701) | 1.029366 (0.013548) | 1.021360 (0.009361) | 1.017232 (0.009338) | 1.020562 (0.006543)
PHS | 1.0 (0) | 1.0 (0) | 1.0 (0) | 1.0 (0) | 1.0 (0)
THS | 1.013661 (0.006576) | 1.010203 (0.004265) | 1.008389 (0.004365) | 1.006921 (0.003574) | 1.012213 (0.005109)
LHS | 1.002474 (0.004162) | 1.000327 (0.001182) | 1.000043 (0.000179) | 1.000094 (0.000432) | 1.000003 (0.000017)
EHS | 1.0 (0) | 1.0 (0) | 1.0 (0) | 1.0 (0) | 1.0 (0)

Camel-Back (PAR →) | 0.1 | 0.3 | 0.5 | 0.7 | 0.9
1.031628
(0)
1.031628
(0)
1.029234
(0.005282)
1.031628
(0)
1.031628
(0)
1.031628
(0)
1.031628
(0)
1.031628
(0)
1.030303
(0.003514)
1.031628
(0)
1.031628
(0)
1.031628
(0)
1.031628
(0)
1.031628
(0)
1.030860
(0.002570)
1.031628
(0)
1.031628
(0)
1.031628
(0)
1.031628
(0)
1.031628
(0)
1.030730
(0.002651)
1.031628
(0)
1.031628
(0)
1.031628
(0)
1.031628
(0)
1.031628
(0)
1.027762
(0.006115)
1.031628
(0)
1.031628
(0)
1.031628
(0)
GHS
PHS
THS
LHS
EHS
Rastrigin
RHS
GHS
PHS
THS
LHS
EHS
Ackley
RHS
GHS
PHS
THS
LHS
EHS
Griewank
RMC
GHS
PHS
THS
LHS
EHS
Camel-Back
RHS
GHS
PHS
THS
LHS
EHS
4.3. Effect of HMS
In this section, the effect of the HMS value on the performance of the proposed HS variations is studied. Tables 10–12 show the averages and standard deviations of the results produced by the HS variations on the same benchmark functions. These results were produced using different HMS values (i.e., HMS = 5, 10, 20, 50) with N = 30 dimensions.
Table 18
Effect of PAR (f11, . . . , f14).
[Mean (standard deviation) results of RHS, GHS, PHS, THS, LHS, and EHS on Shifted Sphere, Shifted Schwefel’s problem 1.2, Shifted Rosenbrock, and Shifted Rastrigin at PAR = 0.1, 0.3, 0.5, 0.7, 0.9; the flattened numeric columns could not be reliably realigned and are omitted.]
The results demonstrate that the HS variations have broadly similar performance across the various values of HMS. In other words, the HS variations are not particularly sensitive to the HMS setting, and no single HMS value can be universally recommended for this kind of problem.
As observed in [18,16], using an HM with a small number of solutions is the better choice for the HS variations, since it reduces the required memory space. It is also consistent with the underlying metaphor: the short-term memory of musicians is small, and HMS imitates the limited size of human short-term memory.
4.4. Effect of HMCR
The performance of the HS variations using different HMCR values is investigated in this section. The results for the same 14 benchmark functions using varying HMCR values (i.e., 0.5, 0.7, 0.9, 0.94, 0.99) are summarized in Tables 13–15. Once more, the results are reported in terms of averages and standard deviations over 30 independent experimental replications.
In general, the performance of the proposed HS variations improves as the HMCR value increases. However, HMCR = 0.99 is not recommended. A plausible explanation lies in the concepts of exploration and exploitation in search.
A larger HMCR value means a higher probability of drawing values from the HM, and thus exploration is decreased. In contrast, a small HMCR value increases diversity but hinders convergence speed. With HMCR = 0.99, diversity is almost lost and the HS variations easily become stuck in local minima.
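The interplay of HMCR and PAR described above can be sketched as a single improvisation step. This is a minimal illustration under stated assumptions, not the paper's implementation; the names `improvise`, `bw` (pitch-adjustment bandwidth), `lb`, and `ub` are introduced here for the sketch.

```python
import random

def improvise(hm, lb, ub, hmcr=0.9, par=0.3, bw=0.01):
    """One HS improvisation step (minimal sketch).

    With probability HMCR a variable is exploited from harmony memory
    (and optionally pitch-adjusted with probability PAR); otherwise it is
    explored uniformly at random from [lb, ub]. A very high HMCR such as
    0.99 leaves almost no random exploration, which is why diversity
    collapses and the search can stall in local minima.
    """
    n = len(hm[0])
    new = []
    for i in range(n):
        if random.random() < hmcr:                       # memory consideration
            x = hm[random.randrange(len(hm))][i]
            if random.random() < par:                    # pitch adjustment
                x += random.uniform(-1.0, 1.0) * bw
            x = min(max(x, lb), ub)                      # keep within the range
        else:                                            # random consideration
            x = random.uniform(lb, ub)
        new.append(x)
    return new

random.seed(1)
v = improvise([[0.0, 0.0], [1.0, 1.0]], lb=-5.0, ub=5.0)
print(v)  # two values, each drawn from HM (possibly adjusted) or from [-5, 5]
```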
4.5. Effect of PAR
In this section, the performance of the HS variations using different PAR values is studied. The results for all functions using five PAR values (i.e., 0.1, 0.3, 0.5, 0.7, 0.9) are summarized in Tables 16–18, again in terms of averages and standard deviations over 30 independent experimental replications. The results show that there is no single superior PAR value applicable to all functions, as the HS variations are not sensitive to PAR.
5. Conclusion and future work
This paper proposed new variations of harmony search (HS) based on different selection schemes. Each variation is an HS with a selection scheme incorporated into the memory consideration process. These are Global-best Harmony Search (GHS), Proportional Harmony Search (PHS), Tournament Harmony Search (THS), Linear rank Harmony Search (LHS), and Exponential rank Harmony Search (EHS). The proposed HS variations employ the natural selection principle of ‘survival of the fittest’ to generate the new harmony by focusing on the better solutions stored in harmony memory (HM). The experiments were conducted using global benchmark functions widely used in the literature. The experimental results show that incorporating the proposed selection schemes into the process of memory consideration has a direct impact on HS performance. The effect of the various parameters on the performance of the HS variations was studied thoroughly; the results show that the HS variations are not very sensitive to parameter settings, specifically HMS and PAR, whereas increasing the value of the HMCR parameter leads to better solutions in all HS variations.
As this study is an initial exploration of selection schemes in the HS algorithm, future work can be directed to analyzing these selection schemes in terms of takeover time [38], reproduction rate, loss of diversity, selection variance, and selective pressure [31]. The parameters of the rank-based selection schemes (i.e., k in the tournament selection, η+ in the linear rank, and s in the exponential rank selection scheme) can also be studied in terms of the selective pressure they provide. In general, PHS is not applicable when negative objective function values are possible. Therefore, a scaling mechanism should be investigated in future work to overcome this problem [20].
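One candidate scaling mechanism for this future work is a simple shift-based transformation that maps arbitrary (possibly negative) objective values to non-negative selection weights before the proportional probabilities are computed. The sketch below illustrates one of several options discussed in the EA literature [20]; it is not the paper's proposal, and the function name and epsilon constant are assumptions made here.

```python
def proportional_probabilities(objective_values):
    """Proportional-selection probabilities for a minimisation problem
    whose objective may be negative (illustrative sketch).

    Shift-based scaling: the worst (largest) cost maps to a tiny positive
    weight and the best (smallest) cost to the largest weight, so the
    roulette wheel remains well defined even with negative objectives.
    """
    worst = max(objective_values)
    eps = 1e-9                                   # keeps every weight positive
    weights = [worst - f + eps for f in objective_values]
    total = sum(weights)
    return [w / total for w in weights]

probs = proportional_probabilities([-3.0, 1.0, 5.0])
print(probs)  # the best (lowest) cost receives the highest probability
```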
References
[1] Z.W. Geem, J.H. Kim, G.V. Loganathan, A new heuristic optimization algorithm: harmony search, Simulation 76 (2) (2001) 60–68.
[2] M. Mahdavi, M. Fesanghary, E. Damangir, An improved harmony search algorithm for solving optimization problems, Applied Mathematics and
Computation 188 (2) (2007) 1567–1579.
[3] Z.W. Geem, Novel derivative of harmony search algorithm for discrete design variables, Applied Mathematics and Computation 199 (1) (2008) 223–
230.
[4] X.-S. Yang, Harmony search as a metaheuristic algorithm, in: Z.W. Geem (Ed.), Music-Inspired Harmony Search Algorithm, SCI, vol. 191, Springer-Verlag, Berlin, Heidelberg, 2009, pp. 1–14.
[5] K.S. Lee, Z.W. Geem, A new meta-heuristic algorithm for continuous engineering optimization: harmony search theory and practice, Computer
Methods in Applied Mechanics and Engineering 194 (36–38) (2005) 3902–3933.
[6] A.A. Taleizadeh, S.T.A. Niaki, F. Barzinpour, Multiple-buyer multiple-vendor multi-product multi-constraint supply chain problem with stochastic
demand and variable lead-time: a harmony search algorithm, Applied Mathematics and Computation 217 (22) (2011) 9234–9253.
[7] M.A. Al-Betar, A.T. Khader, A harmony search algorithm for university course timetabling, Annals of Operations Research (2010) 1–29. doi:10.1007/s10479-010-0769-z.
[8] M.A. Al-Betar, A.T. Khader, J.J. Thomas, A combination of metaheuristic components based on harmony search for the uncapacitated examination
timetabling, in: 8th International Conference on the Practice and Theory of Automated Timetabling (PATAT 2010), Belfast, Northern Ireland, 2010.
[9] M.A. Al-Betar, A.T. Khader, I.Y. Liao, A harmony search algorithm with multi-pitch adjusting rate for university course timetabling, in: Z.W. Geem (Ed.),
Recent Advances In Harmony Search Algorithm, SCI, vol. 270, Springer-Verlag, Berlin, Heidelberg, 2010, pp. 147–162.
[10] M.A. Al-Betar, A.T. Khader, F. Nadi, Selection mechanisms in memory consideration for examination timetabling with harmony search, in: GECCO ’10:
Proceedings of Genetic and Evolutionary Computation Conference, ACM, Portland, Oregon, USA, 2010.
[11] L. Wang, Q.-K. Pan, M.F. Tasgetiren, A hybrid harmony search algorithm for the blocking permutation flow shop scheduling problem, Computers &
Industrial Engineering 61 (1) (2011) 76–83.
[12] G. Ingram, T. Zhang, Overview of applications and developments in the harmony search algorithm, in: Z.W. Geem (Ed.), Music-Inspired Harmony
Search Algorithm, SCI, vol. 191, Springer-Verlag, Berlin, Heidelberg, 2009, pp. 15–37.
[13] Z.W. Geem, State-of-the-art in the structure of harmony search algorithm, in: Z.W. Geem (Ed.), Recent Advances In Harmony Search Algorithm, SCI, vol. 270, Springer-Verlag, Berlin, Heidelberg, 2010, pp. 1–10.
[14] O. Alia, R. Mandava, The variants of the harmony search algorithm: an overview, Artificial Intelligence Review 36 (2011) 49–68.
[15] Z.W. Geem, K.-B. Sim, Parameter-setting-free harmony search algorithm, Applied Mathematics and Computation 217 (8) (2010) 3881–3889.
[16] Q.-K. Pan, P. Suganthan, M.F. Tasgetiren, J. Liang, A self-adaptive global best harmony search algorithm for continuous optimization problems, Applied
Mathematics and Computation 216 (3) (2010) 830–848.
[17] B. Alatas, Chaotic harmony search algorithms, Applied Mathematics and Computation 216 (9) (2010) 2687–2699.
[18] M.G.H. Omran, M. Mahdavi, Global-best harmony search, Applied Mathematics and Computation 198 (2) (2008) 643–656.
[19] S. Das, A. Mukhopadhyay, A. Roy, A. Abraham, B.K. Panigrahi, Exploratory power of the harmony search algorithm: analysis and improvements for
global numerical optimization, IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics 41 (1) (2011) 89–106.
[20] T. Bäck, Evolutionary Algorithms in Theory and Practice: Evolution Strategies, Evolutionary Programming, Genetic Algorithms, Oxford University Press,
Oxford, UK, 1996.
[21] J. Kennedy, R. Eberhart, Particle swarm optimization, in: Proceedings of the IEEE International Conference on Neural Networks, 1995, pp. 1942–1948.
[22] R. Eberhart, J. Kennedy, A new optimizer using particle swarm theory, in: Proceedings of the Sixth International Symposium on Micro Machine and
Human Science, 1995, pp. 39–43.
[23] J.H. Holland, Adaptation in Natural and Artificial Systems, The University of Michigan Press, Ann Arbor, 1975.
[24] T. Bäck, F. Hoffmeister, H.-P. Schwefel, Extended selection mechanisms in genetic algorithms, in: R.K. Belew, L.B. Booker (Eds.), Proceedings of the Fourth International Conference on Genetic Algorithms, Morgan Kaufmann, San Mateo, 1991, pp. 92–99.
[25] B.L. Miller, D.E. Goldberg, Genetic algorithms, selection schemes, and the varying effects of noise, Evolutionary Computation 4 (1996) 113–131.
[26] P.J.B. Hancock, An empirical comparison of selection methods in evolutionary algorithms, in: Selected Papers from AISB Workshop on Evolutionary
Computing, Springer-Verlag, London, UK, 1994, pp. 80–94.
[27] J.E. Baker, Adaptive selection methods for genetic algorithms, in: Proceedings of the 1st International Conference on Genetic Algorithms, L. Erlbaum
Associates Inc., Hillsdale, NJ, USA, 1985, pp. 101–111.
[28] D. Goldberg, K. Deb, B. Korb, Messy genetic algorithms: motivation, analysis, and first results, Complex Systems (3) (1989) 493–530.
[29] A. Sokolov, D. Whitley, A. Motta Salles Barreto, A note on the variance of rank-based selection strategies for genetic algorithms and genetic
programming, Genetic Programming and Evolvable Machines 8 (2007) 221–237.
[30] A.E. Eiben, J.E. Smith, Introduction to Evolutionary Computing, Springer-Verlag, 2003.
[31] T. Blickle, L. Thiele, A comparison of selection schemes used in evolutionary algorithms, Evolutionary Computation 4 (4) (1997) 361–394.
[32] T. Bäck, Selective pressure in evolutionary algorithms: A characterization of selection mechanisms, in: Proceedings of the First IEEE Conference on
Evolutionary Computation, IEEE Press, 1994, pp. 57–62.
[33] D. Zou, L. Gao, J. Wu, S. Li, Novel global harmony search algorithm for unconstrained problems, Neurocomputing 73 (16–18) (2010) 3308–3318.
[34] X. Yao, Y. Liu, G. Lin, Evolutionary programming made faster, IEEE Transactions on Evolutionary Computation 3 (2) (1999) 82–102.
[35] P.N. Suganthan, N. Hansen, J.J. Liang, K. Deb, Y.-P. Chen, A. Auger, S. Tiwari, Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization, Technical Report, KanGAL Report #2005005, IIT Kanpur, India, and Nanyang Technological University, Singapore, 2005.
[36] Z.W. Geem, Particle-swarm harmony search for water network design, Engineering Optimization 41 (4) (2009) 297–311.
[37] Z.W. Geem, Optimal cost design of water distribution networks using harmony search, Engineering Optimization 38 (3) (2006) 259–280.
[38] D.E. Goldberg, K. Deb, A comparative analysis of selection schemes used in genetic algorithms, in: Foundations of Genetic Algorithms, Morgan
Kaufmann, 1991, pp. 69–93.