A New Particle Acceleration-based Particle Swarm Optimization Algorithm

Avinash Tripathi
XYX Institute of Technology, Gwalior, India
shail.tripathi@yahoo.com

Abstract. Optimization of one or more objective functions is a requirement for many real-life problems. Due to their wide applicability in business, engineering, and other areas, a number of algorithms have been proposed in the literature.

Keywords: PSO, PSO-TVAC, Parameter tuning

1 Introduction

An optimization problem is to find the best solution from all feasible solutions, that is, an optimal value of the objective function, which may be either a maximum or a minimum. The objective function, its search space, and the constraints are all parts of the optimization problem. A local optimum of an optimization problem is a solution that is optimal (either maximal or minimal) within a neighboring set of solutions. Section 4 describes the experimental setup and provides results for the proposed approach on benchmark functions, and finally, in Section 5, conclusions are drawn and future work is suggested.
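For clarity, the general problem described at the start of this section can be written in standard notation (a textbook formulation, added here rather than reproduced from the paper) as

  \min_{x \in S} f(x) \quad \text{subject to} \quad g_i(x) \le 0, \; i = 1, \dots, m,

where f is the objective function, S is the search space, and the g_i are the constraints; a point x* is a local minimum if f(x*) \le f(x) for every feasible x in some neighborhood of x*.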

2 Background Details & Related Work

PSO is a robust optimization technique that applies the social intelligence of swarms to solve optimization problems. The first version of this algorithm was proposed by J. Kennedy and R. Eberhart [1]; it updates the swarm iteratively until convergence is detected. A PSO algorithm called PSO with neighborhood search strategies (NSPSO) utilizes one local and two global neighborhood search strategies. Another variant combines Particle Swarm Optimization (PSO) with strategies of scatter search and path re-linking to create a more efficient variant of PSO.
Mohammed El-Abd et al. introduced PSO-Bounds, which is based on the concept of Population-Based Incremental Learning (PBIL) and allows the PSO algorithm to adjust the search-space boundaries as the search progresses [17, 18]. Xu et al. proposed adaptive parameter tuning in APSO-VI [20] by introducing velocity information, defined as the average absolute value of the velocity of all particles. Lin et al. [19] proposed a local and global search combined PSO (LGSCPSOA), analyzed its convergence, and obtained its convergence qualification. A variant of PSO-TVAC has been proposed and used in the software test-case generation process in [21].
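For reference, a minimal sketch (in Python, not taken from this paper) of the canonical velocity and position update introduced in [1] is given below; the variable names and parameter values are illustrative only.

import numpy as np

def pso_step(x, v, pbest, gbest, w=0.7, c1=2.0, c2=2.0):
    """One canonical PSO update [1].
    x, v, pbest: (n_particles, dim) arrays of positions, velocities,
    and personal best positions; gbest: (dim,) best position of the swarm."""
    n, dim = x.shape
    r1 = np.random.rand(n, dim)  # random cognitive weights
    r2 = np.random.rand(n, dim)  # random social weights
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)  # velocity update
    x = x + v                                                   # position update
    return x, v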

3 Proposed Approach

Each optimization algorithm is designed to minimize the search time needed to reach the global optimal solution. The proposed algorithm, PSOM, is also designed to fulfill these requirements. To improve the convergence rate of the existing PSO, a change has been introduced, given by equation (6):

(6)

where a is the acceleration.

Initially, a population of particles is generated randomly and each particle is associated with a position and a velocity in the search space. The swarm is updated iteratively until the termination criterion is met (t becomes equal to Imax), after which:
17. The global best position of the swarm is the optimal solution.
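Because the body of equation (6) is not reproduced above, the following Python sketch only illustrates the overall procedure described in this section (random initialization, iteration until t equals Imax, and returning the swarm's global best). The extra acceleration term a*(v - v_prev) is an assumption standing in for equation (6), not the authors' exact formula, and all parameter values are illustrative.

import numpy as np

def psom(f, dim, lower=-5.0, upper=5.0, n_particles=30, i_max=1000,
         w=0.7, c1=2.0, c2=2.0, a=0.1):
    # Random initial population of particles.
    x = np.random.uniform(lower, upper, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    v_prev = np.zeros_like(v)
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()

    for t in range(i_max):  # iterate until t becomes equal to Imax
        r1 = np.random.rand(n_particles, dim)
        r2 = np.random.rand(n_particles, dim)
        accel = a * (v - v_prev)          # assumed acceleration term (stand-in for eq. (6))
        v_prev = v.copy()
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x) + accel
        x = np.clip(x + v, lower, upper)  # keep particles inside the closed interval
        vals = np.array([f(p) for p in x])
        better = vals < pbest_val
        pbest[better] = x[better]
        pbest_val[better] = vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()

    # Step 17: the global best position of the swarm is the optimal solution.
    return gbest, pbest_val.min()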

4 Experimental Setup and Results

To compare the performance of the proposed algorithm with state-of-the-art algorithms, the BBOB noiseless test-bed is used. The optimal value of each particle's position is searched within a closed interval. The benchmark functions and evaluation setup for the proposed algorithm have been taken from the COCO framework [22].
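A minimal sketch of how such an experiment can be run on the BBOB noiseless test-bed through the COCO framework's Python bindings (cocoex) is shown below; the result-folder name and the call to the psom routine sketched earlier are illustrative assumptions, not part of the paper's setup.

import cocoex  # Python interface of the COCO benchmarking framework [22]

# The "bbob" suite holds the 24 noiseless benchmark functions.
suite = cocoex.Suite("bbob", "", "dimensions: 2,3,5,10,20")
observer = cocoex.Observer("bbob", "result_folder: PSOM")  # where COCO writes its logs

for problem in suite:
    problem.observe_with(observer)  # record every function evaluation for post-processing
    psom(problem, problem.dimension,
         lower=problem.lower_bounds[0], upper=problem.upper_bounds[0])
    problem.free()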
Results:

Fig. 1. Improved results of PSOM on 2-D and 3-D

Fig. 2. Improved results of PSOM on 5-D

Fig. 3. Improved results of PSOM on 10-D

Fig. 4. Improved results of PSOM on 20-D

5 Conclusions
From the results, it is clear that on dimension 2 the performance of PSOM has improved over PSO on multimodal functions. However, its convergence rate has

References
1. J. Kennedy and R. Eberhart, “Particle swarm optimization”, in Proc. IEEE Int. Conf.
Neural Networks, 1995, pp. 1942-1948.
2. Van Den Bergh, Frans. “An analysis of particle swarm optimizers”, PhD diss., University
of Pretoria, 2006.
3. R. Brits, A.P. Engelbrecht, and F. van den Bergh, “A Niching Particle Swarm Optimizer”,
in Proceedings of the 4th Asia-Pacific Conference on Simulated Evolution and Learning
(SEAL02), volume 2, pages 692-696.
4. http://en.wikipedia.org/wiki/Particle_swarm_optimization
5. Zhang, Jun, De-Shuang Huang, Tat-Ming Lok, and Michael R. Lyu, “A novel adaptive
sequential niche technique for multimodal function optimization”, Neurocomputing 69,
no. 16 (2006): 2396-2401.
6. S. Bird and X. Li, “Adaptively choosing niching parameters in a PSO”, in Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation, pages 3-10, ACM New York, NY, USA, 2006.
7. Evers, George I., and M. Ben Ghalia, “Regrouping particle swarm optimization: a new global optimization algorithm with improved performance consistency across benchmarks”, in Systems, Man and Cybernetics, 2009 (SMC 2009), IEEE International Conference on, IEEE, 2009.
