Abstract - In this paper, the adaptation of network weights using Particle Swarm Optimization (PSO) is proposed as a mechanism to improve the performance of an Artificial Neural Network (ANN) in classifying the IRIS dataset. Classification is a machine learning technique used to predict group membership for data instances; neural networks are introduced here to simplify the classification problem. This paper focuses on IRIS plant classification using a neural network. The problem concerns the identification of IRIS plant species on the basis of plant attribute measurements. Classifying the IRIS dataset means discovering patterns in the petal and sepal measurements of the plants and using those patterns to predict the class of each plant. With such a pattern-based classifier, unseen data can be predicted more precisely in the future. Artificial neural networks have been successfully applied to problems in pattern classification, function approximation, optimization, and associative memory. In this work, multilayer feed-forward networks are trained using the back-propagation learning algorithm.
Keywords - Artificial neural network, particle swarm optimization, machine learning, back-propagation, IRIS.
I. INTRODUCTION
We view particle swarm optimization as a mid-level form of A-life or biologically derived algorithm, occupying the space in nature between evolutionary search, which requires eons, and neural processing, which occurs on the order of milliseconds. Social optimization occurs in the time frame of ordinary experience; in fact, it is ordinary experience. In addition to its ties with A-life, particle swarm optimization has obvious ties with evolutionary computation. Conceptually, it seems to lie somewhere between genetic algorithms and evolutionary programming. Here we describe the use of back-propagation neural networks (BPNN) for the identification of iris plants on the basis of the following measurements: sepal length, sepal width, petal length, and petal width. We compare the fitness of neural networks whose input data are normalized by column, row, sigmoid, and column-constrained sigmoid normalization. The paper also analyzes the performance of back-propagation neural networks with various numbers of hidden-layer neurons and differing numbers of training cycles (epochs). The analysis of network performance is based on several criteria: plants incorrectly identified in the training set (recall) and in the testing set (accuracy), the specific error within incorrectly identified plants, the overall dataset error as tested, and class identification precision.
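To make the training procedure concrete, the following is a minimal sketch of a back-propagation network for the four Iris measurements; the layer size, learning rate, epoch count, and weight initialization are illustrative assumptions rather than the exact settings used in this work.

```python
# Minimal back-propagation sketch for the 4-input / 3-class Iris task.
# Hidden-layer size, learning rate, epochs, and weight scale are
# assumptions made for illustration, not the settings of this paper.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_bpnn(X, T, n_hidden=5, lr=0.1, epochs=500):
    """Train a 4-n_hidden-3 feed-forward network by gradient descent on MSE."""
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))
    W2 = rng.normal(scale=0.5, size=(n_hidden, T.shape[1]))
    for _ in range(epochs):
        H = sigmoid(X @ W1)                # hidden-layer activations
        Y = sigmoid(H @ W2)                # network outputs
        dY = (Y - T) * Y * (1.0 - Y)       # output-layer error signal
        dH = (dY @ W2.T) * H * (1.0 - H)   # hidden-layer error signal
        W2 -= lr * H.T @ dY / len(X)
        W1 -= lr * X.T @ dH / len(X)
    return W1, W2

def predict(X, W1, W2):
    return sigmoid(sigmoid(X @ W1) @ W2)
```

A simple column normalization, for example, rescales each of the four measurements to [0, 1] before training: X = (X - X.min(0)) / (X.max(0) - X.min(0)).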
A. Artificial Intelligence:
A precise definition of intelligence is not available; it is probably best explained by discussing some of its aspects. In general, intelligence has to do with the processes of knowledge and thinking, also called cognition. These mental processes are needed for tasks such as solving a mathematical problem or playing a game of chess, and a certain level of intelligence is required to perform them. Cognition includes not only deliberate thought processes but also unconscious processes such as perceiving and recognizing an object.
B. Flow Chart:
Fig 2: Process Normalization
Class        Target vector
Setosa       0 1 0
Versicolor   1 0 0
Virginica    0 0 1
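For illustration, a minimal sketch of mapping the class names to these target vectors; the helper name and the use of NumPy are assumptions, not the paper's implementation.

```python
# Map Iris class names to the target vectors listed in the table above.
import numpy as np

TARGETS = {
    "Setosa":     [0, 1, 0],
    "Versicolor": [1, 0, 0],
    "Virginica":  [0, 0, 1],
}

def encode_labels(labels):
    """Return an (n_samples, 3) target matrix for the given class names."""
    return np.array([TARGETS[name] for name in labels], dtype=float)

# Example: encode_labels(["Setosa", "Virginica"]) -> [[0, 1, 0], [0, 0, 1]]
```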
C. Classification Performance:
As shown in this plot, the validation performance reaches its lowest mean squared error at epoch 46. Mean squared error is the average of the squared differences between the network output and the target. At epoch 54, the test data matrix gives a classification accuracy of 97.3% for the classified patterns (a sketch of both quantities follows Fig 3).
Fig 3: Plot of error per iteration
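For reference, a minimal sketch of the two quantities discussed above, mean squared error and classification accuracy; the function names are illustrative and Y and T are assumed to be NumPy arrays of network outputs and targets.

```python
# Mean squared error and accuracy for network outputs Y against targets T,
# where Y and T are (n_samples, 3) arrays of outputs and target vectors.
import numpy as np

def mean_squared_error(Y, T):
    """Average of the squared differences between output and target."""
    return np.mean((Y - T) ** 2)

def accuracy(Y, T):
    """Fraction of samples whose largest output matches the target class."""
    return np.mean(np.argmax(Y, axis=1) == np.argmax(T, axis=1))
```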
VI. CONCLUSION
Particle swarm optimization is an extremely simple algorithm that appears to be effective for optimizing a wide range of functions. The adjustment toward pi (the particle's best previous position) and pg (the best position found by the swarm) performed by the particle swarm optimizer is conceptually similar to the crossover operation utilized by genetic algorithms. It also uses the concept of fitness, as do all evolutionary computation paradigms. Unique to particle swarm optimization is the idea of flying potential solutions through hyperspace, accelerating toward "better" solutions. In this simulation, we demonstrated the efficiency of this method. Finally, the method can be employed to train various ANNs with different topologies.
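For completeness, the adjustment toward pi and pg referred to above follows the standard PSO velocity and position update; the sketch below uses commonly cited (assumed) coefficient values rather than the exact parameters of this work.

```python
# Standard PSO velocity and position update for one particle.
# Inertia weight w and acceleration constants c1, c2 are assumed values.
import numpy as np

rng = np.random.default_rng(0)

def pso_step(x, v, p_i, p_g, w=0.7, c1=1.5, c2=1.5):
    """Pull particle position x toward its personal best p_i and the
    swarm's global best p_g; return the updated position and velocity."""
    r1 = rng.random(x.shape)
    r2 = rng.random(x.shape)
    v = w * v + c1 * r1 * (p_i - x) + c2 * r2 * (p_g - x)
    return x + v, v
```

When PSO is used to adapt network weights, each particle's position x is simply the flattened weight vector of the ANN, and the fitness is the network's classification error on the training set.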