404 Fuzzy Inference System – Theory and Applications

These algorithms are computationally very simple, yet quite powerful. In addition, they are not limited by assumptions about the search space, such as continuity, the existence of derivatives, and so on.

**3.1.1 Genetic operators**

The basic principle of genetic operators is to transform the population through successive generations, extending the search until a satisfactory outcome is reached. Genetic operators are needed to enable the population to diversify while preserving the adaptation characteristics acquired by previous generations.

The mutation operator is necessary for the introduction and maintenance of the genetic diversity of the population: it arbitrarily changes one or more components of a chosen structure, as illustrated in Figure 7, thus providing a means for introducing new elements into the population. Mutation therefore ensures that the probability of reaching any point in the search space is never zero, and it also helps to circumvent the problem of local minima, because this mechanism slightly changes the search direction. The mutation operator is applied to individuals with a probability given by the mutation rate *Pm*; usually a small mutation rate is used, because mutation is a secondary genetic operator.

Fig. 7. Example of mutation

Crossover is the operator responsible for the recombination of the parents' traits during reproduction, allowing future generations to inherit those traits. It is considered the predominant genetic operator, so it is applied with a probability given by the crossover rate *Pc*, which must be greater than the mutation rate.

This operator can also be used in several ways; the most commonly used are:

a. One-point: a crossover point is chosen, and from this point on the parents' genetic information is exchanged. The information prior to this point in one of the parents is combined with the information subsequent to this point in the other parent, as shown in the example in Figure 8.

Fig. 8. Example of one-point crossover: (a) two individuals are chosen; (b) a crossover point (2) is chosen; (c) the characteristics are recombined, generating two new individuals

b. Multi-point: a generalization of the idea of exchanging genetic material through crossover points, in which many crossover points can be used.

c. Uniform: does not use crossover points, but determines, through a global parameter, the probability of each variable being exchanged between the parents.
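The mutation and crossover operators described above can be sketched as follows for a binary-string encoding. This is a minimal illustration, not the chapter's implementation; the function names, the bit-string encoding, and the default rates are assumptions made here:

```python
import random

def mutate(individual, pm=0.01):
    """Flip each bit independently with probability pm (the mutation rate)."""
    return [1 - bit if random.random() < pm else bit for bit in individual]

def one_point_crossover(parent_a, parent_b):
    """One-point crossover: exchange the parents' tails after a random cut point."""
    point = random.randint(1, len(parent_a) - 1)
    child_a = parent_a[:point] + parent_b[point:]
    child_b = parent_b[:point] + parent_a[point:]
    return child_a, child_b

def uniform_crossover(parent_a, parent_b, p_swap=0.5):
    """Uniform crossover: swap each gene independently with probability p_swap."""
    child_a, child_b = [], []
    for gene_a, gene_b in zip(parent_a, parent_b):
        if random.random() < p_swap:
            gene_a, gene_b = gene_b, gene_a
        child_a.append(gene_a)
        child_b.append(gene_b)
    return child_a, child_b
```

Note that both crossover variants only recombine existing genes, so every gene in the children comes from one of the two parents; only mutation can introduce values absent from the current population.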

**3.1.2 Genetic parameters**

It is also important to analyze how some parameters influence the behavior of genetic algorithms, in order to set them according to the needs of the problem and the available resources.

**Population size.** The size of the population affects the overall performance and efficiency of GAs. With a small population, performance may drop, because the population then provides only a small coverage of the problem's search space. A large population usually provides representative coverage of the problem domain and prevents premature convergence to local rather than global solutions. However, working with large populations requires greater computational resources, or the algorithm must run for a much longer period of time.

**Crossover rate.** The higher this rate, the faster new structures are introduced into the population. But if it is too high, most of the population will be replaced, and high-fitness structures can be lost. With a low value, the algorithm can become very slow.

**Mutation rate.** A low mutation rate prevents a given position from stagnating at a single value, while still allowing the search to reach any point in the space. With a very high rate, the search becomes essentially random.
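Putting these parameters together, a minimal generational GA loop might look like the sketch below. The fitness function, tournament selection, and all parameter values are illustrative assumptions, chosen to show where the population size, *Pc*, and *Pm* enter the algorithm:

```python
import random

def run_ga(fitness, n_bits=16, pop_size=30, pc=0.8, pm=0.01, generations=50):
    """Minimal generational GA: tournament selection, one-point crossover,
    bit-flip mutation. Parameter defaults are illustrative, not prescriptive."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def select():
        # Tournament of size 2: the fitter of two random individuals wins.
        a, b = random.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = select(), select()
            if random.random() < pc:  # crossover applied with probability Pc
                point = random.randint(1, n_bits - 1)
                p1, p2 = p1[:point] + p2[point:], p2[:point] + p1[point:]
            # mutation applied gene by gene with probability Pm
            new_pop.append([1 - g if random.random() < pm else g for g in p1])
            new_pop.append([1 - g if random.random() < pm else g for g in p2])
        pop = new_pop[:pop_size]
    return max(pop, key=fitness)

# Example: maximize the number of ones (the "one-max" toy problem).
best = run_ga(fitness=sum)
```

A larger `pop_size` widens the coverage of each generation at a higher cost per generation, while `pc` and `pm` control how aggressively new structures replace old ones, mirroring the trade-offs described above.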

## **3.2 Particle Swarm Optimization**

The optimization method called Particle Swarm Optimization (PSO), like other recently developed meta-heuristics, simulates the behavior of systems by analogy with social behavior. PSO was originally inspired by the socio-biological behavior associated with groups of birds (Goldberg, 1989). This topic will be discussed in more detail after the basic algorithm is described.

PSO was first proposed by James Kennedy and Russell Eberhart (1995a, 1995b). Some of its interesting features include ease of implementation and the fact that no gradient information is required. It can be used to solve a wide range of optimization problems, including most of the problems that can be solved by genetic algorithms; examples of applications include neural network training (Lee & El-Sharkawi, 2008) and the minimization of various types of functions (Eberhart et al., 1996).

Many popular optimization algorithms are deterministic, such as gradient-based algorithms. PSO, like the other members of the family of Evolutionary Algorithms, is a stochastic algorithm that needs no gradient information derived from an error function. This allows PSO to be used on functions whose gradient is either unavailable or costly to compute.
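As a concrete illustration, a basic global-best PSO can be sketched as follows. The inertia weight `w` and the acceleration coefficients `c1` and `c2` are common textbook values assumed here, not parameters taken from this chapter:

```python
import random

def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, bound=5.0):
    """Minimize f with a basic global-best PSO; no gradient information is used."""
    pos = [[random.uniform(-bound, bound) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # each particle's best position so far
    gbest = min(pbest, key=f)[:]         # best position found by the whole swarm

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # velocity update: inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

# Example: minimize the sphere function, whose minimum is at the origin.
best = pso(lambda x: sum(v * v for v in x))
```

Note that `f` is only ever evaluated, never differentiated, which is precisely what makes PSO applicable when the gradient is unavailable or expensive.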
