5.1 Parameter Optimization of ANN Using PSO
It is well known that the weight (w) and bias (b) parameters have a great
influence on the performance of an ANN. In this work, the PSO method is
employed to optimize these parameters. PSO is a population-based search
method inspired by the collective movement of organisms in a bird flock
or fish school [33]. The method is very easy to implement and has very
few parameters to adjust. PSO searches using a population (called a
swarm) of individuals (called particles) that are updated from iteration
to iteration. To discover the optimal solution, each particle moves in
the direction of its previous best position (pbest) and the best global
position of the swarm (gbest). The velocity and position of the
particles are updated by the following equations:
\begin{equation}
V_{i,j}(t+1)=W\,V_{i,j}(t)+c_{1}r_{1}\left(X_{\text{pbest}}(t)-X_{i,j}(t)\right)+c_{2}r_{2}\left(X_{\text{gbest}}(t)-X_{i,j}(t)\right)\tag{3}
\end{equation}
\begin{equation}
X_{i,j}(t+1)=X_{i,j}(t)+V_{i,j}(t+1)\tag{4}
\end{equation}
where $t$ denotes the iteration counter; $V_{i,j}$ is the velocity of
particle $i$ on the $j$th dimension, whose value is limited to the range
$[V_{\min}, V_{\max}]$; $X_{i,j}$ is the position of particle $i$ on the
$j$th dimension, whose value is limited to the range
$[X_{\min}, X_{\max}]$; $X_{\text{pbest}}$ is the pbest position of
particle $i$ on the $j$th dimension; and $X_{\text{gbest}}$ is the gbest
position of the swarm on the $j$th dimension. The inertia weight $W$ is
used to balance global exploration and local exploitation. The terms
$r_{1}$ and $r_{2}$ are random numbers uniformly distributed in the
range $[0, 1]$. A constriction factor may additionally be used to
control the velocity update; its value is usually set to 1. The positive
constants $c_{1}$ and $c_{2}$ are the personal and social learning
factors, whose values are usually set to 2. Here, each particle encodes
the ANN weights (w) and biases (b). Fig. 3 presents the process of
optimizing the ANN parameters with PSO, which is described below:
Figure 3: PSO-based parameter optimization process of ANN
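As an illustration of the update rules in Equations (3)-(4), one iteration of the swarm can be sketched as follows. This is a minimal sketch, not the paper's implementation; the function name and the clipping bounds for velocity and position are illustrative assumptions.

```python
import numpy as np

def pso_update(X, V, pbest, gbest, W=0.7, c1=2.0, c2=2.0,
               v_bounds=(-1.0, 1.0), x_bounds=(-5.0, 5.0), rng=None):
    """Apply Eqs. (3)-(4) to every particle in the swarm.

    X, V   : (n_particles, n_dims) current positions and velocities
    pbest  : (n_particles, n_dims) personal best positions
    gbest  : (n_dims,) global best position of the swarm
    """
    rng = rng or np.random.default_rng()
    r1 = rng.random(X.shape)               # r1, r2 uniform in [0, 1]
    r2 = rng.random(X.shape)
    # Eq. (3): inertia term + personal (cognitive) term + social term
    V_new = W * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
    V_new = np.clip(V_new, *v_bounds)      # keep velocity in [Vmin, Vmax]
    # Eq. (4): move each particle, keeping positions in [Xmin, Xmax]
    X_new = np.clip(X + V_new, *x_bounds)
    return X_new, V_new
```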
The main steps of the PSO-based parameter optimization process are
summarized as follows.
Step 1: Initialization
In this step, the PSO parameters are initialized and a population of
particles with random positions and velocities is generated.
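The initialization step might look like the following sketch, where the swarm size and the sampling bounds are illustrative assumptions:

```python
import numpy as np

def init_swarm(n_particles, n_dims, x_bounds=(-5.0, 5.0),
               v_bounds=(-1.0, 1.0), rng=None):
    """Generate random particle positions and velocities (Step 1)."""
    rng = rng or np.random.default_rng()
    X = rng.uniform(*x_bounds, size=(n_particles, n_dims))  # positions
    V = rng.uniform(*v_bounds, size=(n_particles, n_dims))  # velocities
    return X, V
```

Each row of `X` is one particle, i.e. one candidate setting of the ANN parameters being optimized.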
Step 2: Train the ANN model and evaluate the fitness function
The ANN model is trained with the parameters w and b encoded in the
current particle. The 10-fold cross-validation technique is applied to
evaluate the fitness function. In 10-fold cross-validation, the training
data set is randomly divided into 10 mutually exclusive subsets of
approximately equal size; 9 subsets are used to train the model and the
remaining subset is used to test it. This procedure is repeated 10
times, so that each subset is used exactly once for testing. The fitness
function is defined as $1-\text{CA}_{\text{validation}}$, where
$\text{CA}_{\text{validation}}$ is the classification accuracy of the
10-fold cross-validation on the training data set, as shown in
Equation (5). A solution with a larger $\text{CA}_{\text{validation}}$
therefore has a smaller fitness value.
\begin{equation}
\text{Fitness}=1-\text{CA}_{\text{validation}}\tag{5}
\end{equation}
\begin{equation}
\text{CA}_{\text{validation}}=\frac{1}{10}\sum_{i=1}^{10}\frac{y_{c}}{y_{c}+y_{f}}\tag{6}
\end{equation}
where $y_{c}$ and $y_{f}$ represent the number of correctly and
incorrectly classified samples in each fold, respectively.
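The fitness evaluation of Step 2 can be sketched with a generic 10-fold split. Here `train_and_score` is a hypothetical stand-in for training the ANN with the particle's w and b and counting the correct ($y_c$) and incorrect ($y_f$) test classifications; it is not part of the paper's code.

```python
def cv_fitness(samples, train_and_score, k=10):
    """Fitness = 1 - CA_validation, following Eqs. (5)-(6).

    train_and_score(train, test) -> (y_c, y_f): counts of correct and
    incorrect classifications on `test` for a model trained on `train`.
    """
    # Split the data into k roughly equal, mutually exclusive subsets.
    folds = [samples[i::k] for i in range(k)]
    accs = []
    for i in range(k):
        test = folds[i]                                   # held-out fold
        train = [s for j, f in enumerate(folds) if j != i for s in f]
        y_c, y_f = train_and_score(train, test)
        accs.append(y_c / (y_c + y_f))                    # fold accuracy
    ca_validation = sum(accs) / k                         # Eq. (6)
    return 1.0 - ca_validation                            # Eq. (5)
```

A perfect classifier yields fitness 0, and lower fitness always means higher validation accuracy, matching the minimization in the PSO loop.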
Step 3: Update the global and personal best positions
In this step, the global best and personal best positions of the
particles are updated according to their fitness values.
Step 4: Update the velocity and position
The position and velocity of each particle are updated using
Equations (3)-(4), which yields the new particle positions for the next
iteration.
Step 5: Termination condition
Steps 2 to 4 are repeated until the termination condition is satisfied.
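Putting Steps 1-5 together, the overall optimization loop might look like the sketch below. The sphere function in the usage note stands in for the ANN cross-validation fitness; all names, bounds, and settings here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def pso_minimize(fitness, n_dims, n_particles=20, n_iters=100,
                 W=0.7, c1=2.0, c2=2.0, seed=0):
    """Minimal PSO loop following Steps 1-5 of the text."""
    rng = np.random.default_rng(seed)
    # Step 1: initialize random positions and velocities.
    X = rng.uniform(-5.0, 5.0, (n_particles, n_dims))
    V = rng.uniform(-1.0, 1.0, (n_particles, n_dims))
    pbest = X.copy()
    pbest_f = np.array([fitness(x) for x in X])
    gbest = pbest[pbest_f.argmin()].copy()
    gbest_f = pbest_f.min()
    for _ in range(n_iters):                     # Step 5: repeat until done
        # Step 4: update velocity and position via Eqs. (3)-(4).
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        V = W * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
        V = np.clip(V, -1.0, 1.0)
        X = np.clip(X + V, -5.0, 5.0)
        # Step 2: evaluate the fitness of every particle.
        f = np.array([fitness(x) for x in X])
        # Step 3: update personal and global best positions.
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = X[improved], f[improved]
        if pbest_f.min() < gbest_f:
            gbest_f = pbest_f.min()
            gbest = pbest[pbest_f.argmin()].copy()
    return gbest, gbest_f
```

For example, `pso_minimize(lambda x: float(np.sum(x ** 2)), 3)` drives the swarm toward the origin, the minimum of the sphere function; swapping in the cross-validation fitness of Step 2 would instead search over the ANN's w and b.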