**2. Genetic Algorithm description**

Once the problem encoding and the fitness function have been chosen, the evolution process begins. To evolve new solutions, an initial population of encoded solutions is created randomly or using some problem-specific knowledge. This population is subjected to genetic operators to create new, promising solutions.

A typical genetic algorithm starts with a randomly generated population, whose individuals are described in terms of genes, loci, alleles, chromosomes, genotypes, variables and phenotypes (Holland, 1975; Goldberg, 1989; Michalewicz, 1999; Coello-Coello, 2007), figure 1.

Fig. 1. Chromosome binary representation.

Individuals are probabilistically selected by evaluating the objective function. A gene has converged when at least 95% of the individuals in the population share the same value for that gene; the population converges when all the genes have converged.
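Under the binary representation this 95% criterion can be checked directly on the population; a minimal sketch (the population layout as a list of bit lists is an illustrative assumption):

```python
# Sketch of the 95% convergence criterion described above.
# Assumption: the population is a list of equal-length bit lists.

def gene_converged(population, locus, threshold=0.95):
    """A gene (locus) has converged when at least `threshold`
    of the individuals share the same allele at that locus."""
    ones = sum(ind[locus] for ind in population)
    share = max(ones, len(population) - ones) / len(population)
    return share >= threshold

def population_converged(population, threshold=0.95):
    """The population converges when every gene has converged."""
    length = len(population[0])
    return all(gene_converged(population, i, threshold)
               for i in range(length))
```

For example, in a population of 20 individuals a gene shared by 19 of them (19/20 = 0.95) counts as converged under this rule.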

Different operators exist in GAs, the most popular being (1) *selection*, (2) *crossover*, and (3) *mutation*. The steps of a genetic algorithm, as defined in (Goldberg, 1989), are shown in the diagram of figure 2.

The *initial population* is created randomly, and each individual is encoded within a chromosome, an array whose length depends on the problem. The coding can be done in a binary representation (Goldberg, 1989), based on the domain of each variable (figure 3).
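As an illustration of this binary coding, each variable can be mapped from its domain onto a fixed number of bits; the 8-bit resolution below is an invented choice, not one fixed by the chapter:

```python
# Illustrative binary encoding/decoding of one real variable
# over its domain [lo, hi]. Assumption: 8 bits per variable.

BITS = 8

def encode(value, lo, hi, bits=BITS):
    """Map a real value in [lo, hi] to a bit list of length `bits`."""
    step = (hi - lo) / (2 ** bits - 1)
    n = round((value - lo) / step)
    return [(n >> i) & 1 for i in reversed(range(bits))]

def decode(chromosome, lo, hi):
    """Map a bit list back to a real value in [lo, hi]."""
    n = int("".join(map(str, chromosome)), 2)
    return lo + n * (hi - lo) / (2 ** len(chromosome) - 1)
```

With the domain [-60, 60], the extremes encode and decode exactly, while interior values are recovered within one quantization step.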

In the *decodification* it is necessary to have a representation of the genotype that assigns the parameters within a chain of symbols known as genes. The *evaluation* uses a fitness function that reflects the value of the individual in terms of the real value of the variables in the problem's domain. In many combinatorial optimization cases, however, where a great number of restrictions exists, part of the points of the search space may represent invalid individuals. For example, the equation for the synthesis of planar mechanisms is:

$$F = \left(C_{xd}^{i}\left(v\right) - C_{xg}^{i}\left(v\right)\right)^{2} + \left(C_{yd}^{i}\left(v\right) - C_{yg}^{i}\left(v\right)\right)^{2} \tag{1}$$

44 Bio-Inspired Computational Algorithms and Their Applications

(Ramírez-Gordillo, 2010; Lugo González, 2010), used here for optimizing the trajectory generation in closed-chain mechanisms and for planning the effect that relaxing some parameters has on the mechanism. The objective is to show the behavior of the GA when its parameters are relaxed, observing what advantages and disadvantages appear when some parameter exceeds the recommended values established in the literature.

Where $C_{xd}^{i}$ is the set of specific points indicated by the designer, $C_{xg}^{i}$ are the points generated by the coupler of the mechanism, $v = [r_1, r_2, r_3, r_4, r_{cx}, r_{cy}, \theta_0, x_0, y_0]$, and the angles $\theta_2^1, \theta_2^2, \ldots, \theta_2^N$ are values for the variable $\theta_2$, with $i$ indexing these points. The genetic algorithm only maximizes, but a minimization can be made easily by using the reciprocal of the function, avoiding singularity problems (2):

$$fitness_{optimum} = \frac{1}{fitness} \tag{2}$$
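A hedged sketch of equations (1) and (2) in code; the point lists are invented toy data, and the small epsilon is one common guard against division by zero, an implementation detail the chapter does not prescribe:

```python
# Sketch of eq. (1): squared tracking error between the designer's
# points (Cxd, Cyd) and the coupler points (Cxg, Cyg), and of
# eq. (2): reciprocal fitness so the GA can maximize.

def tracking_error(cxd, cyd, cxg, cyg):
    """Eq. (1): sum of squared x and y deviations over the i points."""
    return sum((xd - xg) ** 2 + (yd - yg) ** 2
               for xd, yd, xg, yg in zip(cxd, cyd, cxg, cyg))

def fitness_optimum(error, eps=1e-12):
    """Eq. (2): reciprocal of the error (eps is an illustrative guard)."""
    return 1.0 / (error + eps)
```

A mechanism tracing the desired points exactly gives zero error and hence the largest possible fitness.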

In order to improve the results, approaches such as elitism, regeneration stages and the forced inheritance mechanism can be inserted into the algorithm's process:

Fig. 2. Flowchart of genetic algorithms.
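The flowchart of figure 2 corresponds to a loop of the following shape. This is a generic, self-contained sketch of the steps named above (evaluation, proportional selection, crossover, mutation); the OneMax objective (count of 1-bits) and all parameter values are toy assumptions standing in for the mechanism-synthesis fitness:

```python
# Minimal GA following the flowchart of figure 2:
# initial population -> evaluation -> selection -> crossover -> mutation.

import random

def run_ga(length=20, pop_size=30, generations=60,
           pc=0.8, pm=None, seed=1):
    rng = random.Random(seed)
    pm = pm if pm is not None else 1.0 / length   # per-bit rate l^-1
    fitness = sum                                  # OneMax toy objective
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # proportional (roulette) selection
        weights = [fitness(ind) + 1e-9 for ind in pop]
        parents = rng.choices(pop, weights=weights, k=pop_size)
        nxt = []
        for a, b in zip(parents[::2], parents[1::2]):
            a, b = a[:], b[:]
            if rng.random() < pc:                  # single-point crossover
                cut = rng.randrange(1, length)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for ind in (a, b):                     # per-bit mutation
                for i in range(length):
                    if rng.random() < pm:
                        ind[i] ^= 1
            nxt += [a, b]
        pop = nxt
    return max(pop, key=fitness)
```

Running the loop drives the best chromosome well above the random-initialization average, illustrating the combined effect of the three operators.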

Performance of Simple Genetic Algorithm Inserting Forced Inheritance Mechanism and Parameters Relaxation

Domain = [-60 60 -60 60 0 60 ……………0 360]

[X0min X0max Y0min Y0max…………. θmin θmax]

Fig. 3. Structure Chromosome.

*Elitism:* In this case the best individual of the population at a certain time is selected as a parent. This reserves two slots in the next generation for the highest-scoring chromosome of the current generation, without allowing that chromosome to be crossed over in the next generation. In one of those slots, the elite chromosome will also not be subject to mutation in the next generation.

*Regeneration Mechanism:* Investigations on living organisms that use renewal strategies, under physiological conditions or after damage, demonstrate the possibility of incorporating cells specialized in providing the reserve cells of an adult organism, thanks to a particular hereditary mechanism; under this view, the algorithm can be considered an evolutionary process within the population. Therefore, a small percentage of the population can be renewed, which increases the formation of building blocks with better possibilities of finding an optimal value, at the cost of the premature-convergence problem of evolutionary algorithms explained by (Hidalgo and Lanchares, 2000) and (Wen-Jyi et al., 2003). Nevertheless, the biological evolution process and its mimicry validate the use of a regeneration factor and its preservation through the genetic operators of selection, crossover and mutation.

*Forced Inheritance Mechanism:* Proposed by (Merchán-Cruz, 2005), this is a complementary part of the regeneration mechanism, a strategy to introduce specialized chromosomes on the basis of elitism during the crossover and mutation process. Unlike elitism, where the fittest individuals of a population pass to the following generation without any alteration, the FIM is introduced into the regeneration, selection, crossover and mutation processes, guaranteeing that the fittest individual of the previous generation undergoes a minimal change that consistently increases its fitness value. This mechanism is very useful when the number of variables in the problem is considerably large.

In the same way that the best obtained chromosome is carried among generations in a simple GA, the best set of chromosomes is also carried to the GA search for the next trajectory parameters. By introducing the best set of chromosomes from the previous trajectory segment into the initial population of the current GA search, the required number of generations to produce a new trajectory segment is reduced, provided that the trajectory is stable at that particular instant, since the optimum or near-optimum solution is already coded into the initial population. If the mechanism has to change its trajectory due to kinematic constraints or any other circumstance, the carried set of chromosomes does not affect the search for a new optimum set, since it is evaluated and ranked according to its fitness. Figure 4 illustrates this mechanism, called the Forced Inheritance Mechanism, FIM (Merchán-Cruz, 2005).

The necessary operations for regeneration and the forced inheritance are:

1. Percentage of the population to regenerate.
2. Choose again the number of individuals, the length of the chromosome and therefore the size of the population.
3. Regeneration takes the value from the individuals by the percentage to be regenerated.
4. This population is converted to binary representation.
5. The position that the regenerated individuals will occupy in the original population is determined without altering the number of individuals.
6. Reinsert the regenerated population into a sector of the original population.
7. The best individual of the regenerated population introduces itself, looking forward not to alter the number of individuals.
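A hedged sketch of a regeneration step with forced inheritance in the spirit of the mechanism above; the regeneration percentage, the choice of the worst individuals as the replaced sector, and the one-bit FIM perturbation are illustrative assumptions the chapter does not fix:

```python
# Illustrative regeneration + forced-inheritance step.
# Assumptions: individuals are bit lists; the sector to replace is
# the worst-ranked slice; FIM = minimally perturbed copy of the elite.

import random

def regenerate_with_fim(pop, fitness, pct=0.2, rng=None):
    rng = rng or random.Random(0)
    n, length = len(pop), len(pop[0])
    k = max(1, int(pct * n))                      # 1. fraction to regenerate
    fresh = [[rng.randint(0, 1) for _ in range(length)]
             for _ in range(k)]                   # 2.-4. new binary individuals
    best = max(pop, key=fitness)
    fim = best[:]                                 # FIM: copy of the fittest...
    fim[rng.randrange(length)] ^= 1               # ...with a minimal change
    if fitness(fim) < fitness(best):              # keep only non-worsening changes
        fim = best[:]
    fresh[0] = fim                                # 7. elite enters the new block
    ranked = sorted(pop, key=fitness, reverse=True)
    return ranked[:n - k] + fresh                 # 5.-6. reinsert; size unchanged
```

The population size is preserved and the best fitness never decreases across the step, which is the guarantee the FIM aims for.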


Following the development of the genetic algorithm, the best individuals of the population are selected as parents of the new generations.

Fig. 4. Forced Inheritance Mechanism (Merchán-Cruz, 2005).

Several techniques exist for parent selection, but the most used is the proportional selection proposed by (Goldberg, 1989), in which each individual has a probability of being selected as a parent that is proportional to its objective-function value.
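Proportional (roulette-wheel) selection can be sketched with a cumulative sum, one standard way to implement it; the fitness values used for testing are toy numbers:

```python
# Roulette-wheel (proportional) selection: an individual's chance of
# being picked equals its fitness divided by the total fitness.

import random

def roulette_select(population, fitnesses, rng=random):
    total = sum(fitnesses)
    spin = rng.uniform(0, total)      # a point on the wheel
    acc = 0.0
    for ind, fit in zip(population, fitnesses):
        acc += fit
        if spin <= acc:
            return ind
    return population[-1]             # numeric-edge fallback
```

An individual with zero fitness occupies a zero-width segment of the wheel and is never chosen, while the fittest individual dominates the draws.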

Crossover is based on taking two well-adapted individuals to obtain descendants that share genes of both. There are several types of crossover mechanisms, used depending on the scheme that is analyzed. According to (Kuri-Morales and Galaviz-Casas, 2002) the most popular are: single-point, two-point and uniform crossover.
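The three variants just named differ only in how the exchanged segments are chosen; a compact sketch over bit lists:

```python
# Single-point, two-point and uniform crossover on equal-length bit lists.

import random

def single_point(a, b, rng=random):
    cut = rng.randrange(1, len(a))
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def two_point(a, b, rng=random):
    i, j = sorted(rng.sample(range(1, len(a)), 2))
    return a[:i] + b[i:j] + a[j:], b[:i] + a[i:j] + b[j:]

def uniform(a, b, rng=random):
    mask = [rng.random() < 0.5 for _ in a]        # gene-wise coin flips
    c1 = [x if m else y for m, x, y in zip(mask, a, b)]
    c2 = [y if m else x for m, x, y in zip(mask, a, b)]
    return c1, c2
```

In all three cases the two children jointly carry exactly the genes of the two parents, only distributed differently.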

*Mutation* is an operator applied with probability *pm*; it inverts a single bit, using a per-bit mutation probability of *l⁻¹*, where *l* is the length of the chromosome chain.
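With the per-bit rate l⁻¹ just described, on average one bit per chromosome is flipped; a minimal sketch:

```python
# Bit-flip mutation with the per-bit probability 1/l described above.

import random

def mutate(chromosome, pm=None, rng=random):
    l = len(chromosome)
    pm = pm if pm is not None else 1.0 / l   # default per-bit rate l^-1
    return [bit ^ 1 if rng.random() < pm else bit
            for bit in chromosome]
```

The extremes behave as the later discussion of the mutation probability says: pm = 100% inverts the whole chromosome, pm = 0% leaves it untouched.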

While crossover needs large populations to effectively combine the necessary information, mutation works best when applied to small populations during a large number of generations. Mutation is usually a secondary search operator which performs a random search locally around a solution, and it has therefore received far less attention. However, in evolution strategies, where mutation is the primary search operator, significant attention has been paid to the development of mutation operators; several mutation operators, including adaptive techniques, have been proposed (Lima, 2005). Clearly, mutation cannot combine information from two parents the way crossover does.

**2.1 Efficiency enhancement of GA**

Goldberg categorized the efficiency enhancement techniques of GA into four broad classes: parallelization, hybridization, time continuation, and evaluation relaxation (Goldberg, 2002):

1. **Parallelization:** GAs are executed on several processors and the computational load is distributed among them (Cantu-Paz, 2000). This leads to significant speedup when solving large-scale problems. Parallelization can be achieved in different ways. A simple way is to have part of the GA operations, such as evaluation, running simultaneously on multiple processors (Bethke, 1976). Another way is to create several subpopulations and let them evolve separately at the same time, while spreading good solutions across the subpopulations (Grosso, 1985).
2. **Hybridization:** Local search methods or domain-specific knowledge are coupled with the GA. GAs are powerful in global search, but they are not as efficient as local search methods in reaching the optimum at micro-scale. Therefore hybridization, which incorporates local search methods into the GA, facilitates local convergence. A common form of hybridization is to apply a local search operator to each member of the population after each generation (Sinha, 2002).
3. **Time Continuation:** The capabilities of both mutation and recombination are utilized to obtain a solution of as high quality as possible with a given limited computational resource (Srivastava, 2002). Time continuation exploits the tradeoff between searching with a large population and a single convergence epoch or with a small population and multiple convergence epochs.
4. **Evaluation Relaxation:** An accurate but computationally expensive fitness evaluation is replaced with a less accurate but computationally inexpensive fitness estimate. The low-cost, less-accurate fitness estimate can be either 1) exogenous, as in the case of surrogate (or approximate) fitness functions, where external means are used to develop the fitness estimate; or 2) endogenous, as in the case of fitness inheritance (Smith, 1995), where the fitness estimate is computed internally and is based on parental fitness.

Some authors, such as (Holland, 1975), have looked into the effect of varying the GA's parameters, which has to be taken into account to exploit its full potential in particular applications. Accordingly, for a search algorithm to perform well online, one has to decide quickly which are the most promising search regions in order to concentrate the search efforts there; off-line performance does not penalize the search algorithm for exploring poor regions of the search space, provided that this helps to achieve the best possible solutions (in terms of fitness). A big generation interval and the use of an elitist strategy also improve the performance of GAs, with the usual recommended mutation rates between 0.001 and 0.01 for the binary representation (Goldberg, 1989) or, in general, a value much smaller than the crossover probability (Cabrera et al., 2002).

**3. Adjustment in the performance of the parameters of the GA**

The design of the algorithm comes down to choosing and determining the degree of control of parameters such as the ranges and the probabilities of mutation and crossover and the size of the population, as supported by the research of (Sanchéz-Marín, 2000). The main parameters that can be adjusted, by degree of importance within the GA, are:

• Population size
• Percentage of crossover
• Percentage of mutation
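These parameters are conveniently grouped into one configuration object when studying their relaxation; the defaults below only echo the recommended ranges cited above (the population size of 100 is an invented illustrative value):

```python
# Illustrative container for the main adjustable GA parameters.
# Defaults follow the recommendations cited in the text; is_relaxed
# flags a configuration that leaves those recommended ranges.

from dataclasses import dataclass

@dataclass
class GAParams:
    population_size: int = 100       # illustrative default
    crossover_rate: float = 0.8      # typically much larger than mutation
    mutation_rate: float = 0.01      # recommended 0.001-0.01 (binary coding)

    def is_relaxed(self):
        """True when a parameter leaves its usual recommended range."""
        return not (0.001 <= self.mutation_rate <= 0.01
                    and self.mutation_rate < self.crossover_rate)
```

A relaxation experiment then amounts to sweeping one field of `GAParams` past the point where `is_relaxed()` becomes true and observing the effect on performance.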

On the other hand, the *crossover probability* indicates how often crossover is performed. If there is no crossover, the offspring is an exact copy of the parents; if there is crossover, the offspring is made from parts of the parents' chromosomes. If the crossover probability is 100%, then all offspring are made by crossover; if it is 0%, a whole new generation is made from exact copies of chromosomes of the old population (which does not mean that the new generation is the same). Crossover is made expecting that the new chromosomes will contain the good parts of the old chromosomes and will perhaps be better; however, it is good to allow some part of the population to survive to the next generation.

The *mutation probability* says how often parts of a chromosome are mutated. If there is no mutation, the offspring is taken after crossover (or copying) without any change; if mutation is performed, part of the chromosome is changed. If the mutation probability is 100%, the whole chromosome is changed; if it is 0%, nothing is changed. Mutation is made to prevent the GA from falling into a local extreme, but it should not occur very often, because then the GA would in fact turn into a *random search*.

Each operator allows the evolutionary process to progress toward promising regions of the search space and can maintain diversity within the population, inhibiting premature convergence to a local optimum by means of new, randomly sampled individuals. On the other hand, the search must be tracked through a metric that quantifies the evolutionary process; this can be done by designing a function that identifies the most suitable individuals. This metric is known as the fitness function, and it measures the ability of an individual to perform well and reach the best possible quality.

Problems typically contain restrictions, such as non-linearity and inequality constraints, which make it necessary to incorporate information about constraint violations into the evaluation. The best-known approach is the *penalty function*, which restricts the fitness function by extending its domain with a penalty factor for any violated constraint. It can penalize an individual for not being feasible, or repair it to make it feasible. The penalty function design must take into account how distant an individual is from the feasible region, the cost of fulfillment and the expected cost of compliance. Some of these penalties are:


• **Death Penalty.** It assigns a fitness of zero to the infeasible individual, avoiding recomputing the constraints or the objective function. However, the algorithm may be truncated if the initial population does not contain any feasible individual.

• **Static Penalty.** It defines levels of violation and chooses a violation coefficient for each one of them.

• **Dynamic Penalty.** The penalty factors change with time; they are sensitive to the values of the parameters and converge prematurely when these are not selected properly.

• **Adaptive Penalty.** The penalty is adjusted on the basis of a feedback process.

The adaptive penalty is the one used in this work.
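A hedged sketch contrasting a static penalty with an adaptive one on a toy constrained problem; the objective, the constraint and the feedback rule are invented for illustration, not taken from this work:

```python
# Toy penalized fitness: maximize f(x) = -x^2 subject to x >= 1.
# static_penalty uses a fixed coefficient; adaptive_penalty adjusts
# the coefficient from feedback on whether the recent best was feasible.

def violation(x):
    return max(0.0, 1.0 - x)             # amount by which x >= 1 is violated

def static_penalty(x, coeff=10.0):
    return -x ** 2 - coeff * violation(x)

def adaptive_penalty(x, coeff, best_was_feasible):
    # Feedback rule (illustrative): harshen the penalty while the best
    # individual keeps being infeasible, soften it otherwise.
    coeff = coeff * 0.9 if best_was_feasible else coeff * 1.5
    return -x ** 2 - coeff * violation(x), coeff
```

Feasible individuals are unaffected by the penalty, while infeasible ones score worse the further they sit from the feasible region; the adaptive variant additionally tightens or loosens this pressure over the generations.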

