Sample Springer
Optimization Algorithm
Avinash Tripathi¹
¹XYX Institute of Technology, Gwalior, India
shail.tripathi@yahoo.com
1 Introduction
An optimization problem is to find the best solution from all feasible solutions, that is, an optimal value of the objective function, which may be either a maximum or a minimum. The objective function, its search space, and the constraints are all parts of the optimization problem. A local optimum of an optimization problem is a solution that is optimal (either maximal or minimal) within a neighboring set of solutions. The remainder of the paper provides results for the proposed approach on benchmark functions; finally, in Section 5, conclusions are drawn and future work is suggested.
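To make the distinction between local and global optima concrete, the following is a minimal sketch using an illustrative one-dimensional multimodal function (the function and the grid-search procedure are chosen for illustration only; they are not from this paper):

```python
# Illustrative multimodal function: two minima, one local and one global.
def f(x):
    return x**4 - 2 * x**2 + 0.5 * x

# Coarse grid search over the feasible interval [-2, 2].
xs = [i / 1000 for i in range(-2000, 2001)]

global_min_x = min(xs, key=f)                        # near x ~ -1.06 (global optimum)
local_min_x = min((x for x in xs if x > 0), key=f)   # near x ~ +0.93 (local optimum)

# The local optimum is optimal only within its neighborhood;
# the global optimum attains the lowest value over the whole domain.
print(f(global_min_x) < f(local_min_x))  # True
```

Here the solution near x ≈ +0.93 is optimal within its neighboring set of solutions but is not the global optimum, which lies near x ≈ -1.06.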
PSO is a robust optimization technique that applies the social intelligence of swarms to solve optimization problems. The first version of this algorithm was proposed by J. Kennedy and R. Eberhart [1].
convergence is detected. A new PSO algorithm called PSO with neighborhood search strategies (NSPSO) utilizes one local and two global neighborhood search strategies. Another variant combines Particle Swarm Optimization (PSO) with the strategies of scatter search and path re-linking to create a more efficient variant of PSO. Mohammed El-Abd et al. introduced PSO-Bounds, which is based on the concept of Population-Based Incremental Learning (PBIL) and allows the PSO algorithm to adjust the search-space boundary as the search progresses [17, 18]. Xu et al. proposed adaptive parameter tuning in APSO-VI [20] by introducing velocity information, defined as the average absolute value of the velocity of all its particles. Lin et al. [19] proposed a local and global search combined PSO (LGSCPSOA) and analyzed its convergence.
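The variants surveyed above all build on the canonical global-best PSO velocity and position update. As background, a minimal sketch follows; the parameter values (w = 0.72, c1 = c2 = 1.49) are common defaults from the PSO literature, not values taken from this paper:

```python
import random

def pso(f, dim, n_particles=20, iters=200, bounds=(-5.0, 5.0),
        w=0.72, c1=1.49, c2=1.49):
    """Minimise f over [lo, hi]^dim with canonical global-best PSO."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # velocity update: inertia + cognitive + social components
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # position update, clamped to the closed search domain
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Usage: minimise the 5-dimensional sphere function.
random.seed(0)
best, best_val = pso(lambda x: sum(xi * xi for xi in x), dim=5)
print(best_val)  # close to 0 on this unimodal function
```

On a unimodal function such as the sphere, this canonical form converges quickly; the neighborhood-search and boundary-adaptation variants cited above modify exactly these update rules.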
3 Proposed Approach
To check the performance of the proposed algorithm against state-of-the-art algorithms, the BBOB noiseless test-bed is used. The optimal value of the particles' position is searched in a closed domain; the experimental setup for the proposed algorithm has been taken from the COCO framework [22].
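For illustration, the BBOB noiseless test-bed includes multimodal benchmarks such as the Rastrigin function. A minimal sketch of that function follows; note that the actual BBOB instances apply shifts and rotations that are omitted here:

```python
import math

# Rastrigin: a standard multimodal benchmark of the kind found in the
# BBOB noiseless test-bed (base form only; BBOB instances are
# shifted/rotated variants).
def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi)
                             for xi in x)

print(rastrigin([0.0] * 5))  # 0.0 at the global optimum
```

The many regularly spaced local minima of such functions are what make them a useful stress test for the swarm's ability to escape local optima.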
4 Results
5 Conclusions
From the results, it is clear that on dimension 2 the performance of PSOM has improved over PSO on multimodal functions. However, its convergence rate has
References
1. J. Kennedy and R. Eberhart, “Particle swarm optimization”, in Proc. IEEE Int. Conf.
Neural Networks, 1995, pp. 1942-1948.
2. Van Den Bergh, Frans. “An analysis of particle swarm optimizers”, PhD diss., University
of Pretoria, 2006.
3. R. Brits, A.P. Engelbrecht, and F. van den Bergh, “A Niching Particle Swarm Optimizer”, in Proceedings of the 4th Asia-Pacific Conference on Simulated Evolution and Learning (SEAL'02), vol. 2, pp. 692-696.
4. http://en.wikipedia.org/wiki/Particle_swarm_optimization.
5. Zhang, Jun, De-Shuang Huang, Tat-Ming Lok, and Michael R. Lyu, “A novel adaptive
sequential niche technique for multimodal function optimization”, Neurocomputing 69,
no. 16 (2006): 2396-2401.
6. S. Bird and X. Li, “Adaptively choosing niching parameters in a PSO”, in Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation, pages 310, ACM, New York, NY, USA, 2006.
7. Evers, George I., and M. Ben Ghalia, “Regrouping particle swarm optimization: a new
global optimization algorithm with improved performance consistency across