**5. Particle swarm algorithm**

Particle swarm optimization (PSO), first introduced by Kennedy and Eberhart [23], is a meta-heuristic approach to global optimization that belongs to the family of algorithms based on the swarm intelligence concept.

#### **5.1 Basic PSO algorithm**

Within PSO, each potential solution is known as a "particle", and the position of the ith particle is given by $p_i = \left(p_{ij}\right)_{j=1,\dots,n}$, where $n$ is the dimension of the search space. From now on, we suppose that we have a swarm *P* of *N* particles $p_1, \dots, p_N$.

During the search process, the particles update their positions using the motion equation:

$$p_i^{t+1} = p_i^t + v_i^{t+1} \tag{7}$$

The ith particle velocity is given by:

$$v_i^{t+1} = v_i^t + c_1 \left(bp_i - p_i^t\right) r_1 + c_2 \left(bg - p_i^t\right) r_2 \tag{8}$$

where $bp_i$ is the best position found by particle $i$, $bg$ is the global best position found by the swarm members, $c_k$, $k = 1, 2$, are the acceleration parameters, usually taken from the interval $[0, 4]$ and also named the "cognitive" and "social" coefficients, and $r_k = \mathrm{diag}\left(\mathrm{uniform}([0, 1])\right)$, $k = 1, 2$. **Figure 5** illustrates the PSO formula used to update the particle positions.
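As a minimal sketch of a single learning step, the following NumPy fragment applies Eqs. (7) and (8) to one particle; all numerical values (positions, bests, and the acceleration parameters set to 2) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
c1, c2 = 2.0, 2.0                  # acceleration parameters (assumed values)
p = np.array([0.5, -1.0])          # current position p_i^t
v = np.array([0.1, 0.2])           # current velocity v_i^t
bp = np.array([0.0, 0.0])          # particle's best position bp_i
bg = np.array([0.2, 0.1])          # swarm's global best position bg

# r_1, r_2 act as diagonal matrices of uniform(0,1) draws,
# i.e. componentwise random scaling of each attraction term
r1, r2 = rng.random(2), rng.random(2)
v_next = v + c1 * r1 * (bp - p) + c2 * r2 * (bg - p)   # Eq. (8)
p_next = p + v_next                                     # Eq. (7)
```

Note that the random factors are drawn componentwise, so each coordinate of the particle is pulled toward the personal and global bests with an independent random strength.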

**Figure 5.** *PSO learning equation illustration.*

The basic PSO pseudo-code can be the following:

1. *Initialization:* initialize the *N* particle positions and velocities, evaluate the fitness of each particle, and set the personal bests $bp_i$ and the global best $bg$.

2. Repeat until convergence. For each of the *N* particles:

	- a. Update the velocity using:

$$v_i^{t+1} = v_i^t + c_1 \left(bp_i - p_i^t\right) r_1 + c_2 \left(bg - p_i^t\right) r_2 \tag{9}$$

	- b. Update the particle position using:

$$p_i^{t+1} = p_i^t + v_i^{t+1} \tag{10}$$

	- c. Evaluate the ith particle fitness $f\left(p_i^{t+1}\right)$;

	- d. If $f\left(p_i^{t+1}\right) \ge f\left(bp_i\right)$, set $bp_i = p_i^{t+1}$;

	- e. If $f\left(p_i^{t+1}\right) \ge f\left(bg\right)$, set $bg = p_i^{t+1}$.

3. At convergence, the best solution found is *bg*.
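The pseudo-code above can be sketched in NumPy as follows. This is a minimal illustrative implementation, not a reference one: the function name, the test objective, and the velocity-clamping safeguard (a common practical addition against exploding velocities) are assumptions beyond the text.

```python
import numpy as np

def pso_maximize(f, lb, ub, n_particles=50, c1=2.0, c2=2.0, n_iter=200, seed=0):
    """Basic PSO following Eqs. (7)-(8): maximizes f over the box [lb, ub]."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    # Step 1: positions uniform over the search space, zero initial velocities
    p = rng.uniform(lb, ub, size=(n_particles, dim))
    v = np.zeros_like(p)
    bp = p.copy()                          # personal best positions bp_i
    bp_val = np.array([f(x) for x in p])   # their fitness values
    bg = bp[bp_val.argmax()].copy()        # global best position bg
    bg_val = bp_val.max()
    # Step 2: iterate the velocity and position updates
    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))   # componentwise uniform(0,1)
        r2 = rng.random((n_particles, dim))
        v = v + c1 * r1 * (bp - p) + c2 * r2 * (bg - p)   # Eq. (8)
        v = np.clip(v, lb - ub, ub - lb)   # clamp velocities (safeguard, assumed)
        p = np.clip(p + v, lb, ub)         # Eq. (7), kept inside the box
        vals = np.array([f(x) for x in p])
        improved = vals >= bp_val          # steps d and e of the pseudo-code
        bp[improved], bp_val[improved] = p[improved], vals[improved]
        if bp_val.max() > bg_val:
            bg_val = bp_val.max()
            bg = bp[bp_val.argmax()].copy()
    # Step 3: at convergence, return the best solution bg
    return bg, bg_val
```

For instance, maximizing $f(x) = -\|x\|^2$ over $[-5, 5]^2$ should drive the global best toward the origin.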

#### **5.2 PSO meta parameters**

*Initialization:* PSO requires an initial estimate of the positions and velocities. For the initial positions, the general consensus is to cover the solution space uniformly: $p_{ij}^0 \sim U\left(LB_j, UB_j\right)$. For the initial velocities, a uniform distribution is also sometimes suggested, to ensure a uniform coverage of the search space; but this can increase the probability that particles move to infeasible solutions. To overcome this inconvenience, the velocities may instead be set to zero or to very small random numbers.
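A sketch of this initialization scheme in NumPy, with an assumed swarm size, dimension, and bounds:

```python
import numpy as np

rng = np.random.default_rng(0)
N, n = 30, 4                       # swarm size and dimension (assumed)
LB = np.array([-5.0, -5.0, 0.0, 0.0])   # lower bounds LB_j (assumed)
UB = np.array([ 5.0,  5.0, 1.0, 1.0])   # upper bounds UB_j (assumed)

# Positions drawn componentwise over the box: p0_ij ~ U(LB_j, UB_j)
p0 = rng.uniform(LB, UB, size=(N, n))
# Zero initial velocities avoid pushing particles out of the feasible region
v0 = np.zeros((N, n))
```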

*Acceleration constants:* The parameters $c_1$ and $c_2$ have a very large impact on the particle trajectories and on the convergence of the algorithm. The larger these constants are, the more the particles oscillate around the optimum, whereas very small values give rise to sinusoidal trajectories. In general, it is recommended to set both parameters to 2 [24].

*Swarm size:* A large swarm improves the diversity of the swarm and its exploration ability but, on the other hand, may also increase the risk of early convergence and the computational cost. Nevertheless, in most situations it has been found that once the swarm size exceeds 50 particles, PSO becomes insensitive to it [24].
