**4. Optimization test function results**

To evaluate the algorithm's ability to solve optimization problems, it was applied to 13 commonly used global numerical optimization test functions. **Table 1** lists these 13 cost functions, $J_1(\mathbf{x})$ through $J_{13}(\mathbf{x})$, where $\mathbf{x} = \left[x_1 \; x_2 \; \cdots \; x_N\right]^T$, and their solution spaces. These functions include a spherical function, three hyper-ellipsoid functions, the sum of different powers, Rastrigin's, Schwefel's, Griewank's, Rosenbrock's valley, the Styblinski-Tang, Ackley's Path, the Price-Rosenbrock, and the Eggholder functions. The first 11 functions, $J_1(\mathbf{x})$ through $J_{11}(\mathbf{x})$, are multidimensional and were tested for two dimensions ($N = 2$) and 35 dimensions ($N = 35$). Functions $J_{12}(\mathbf{x})$ and $J_{13}(\mathbf{x})$ are two-dimensional and were only tested for $N = 2$. For all 13 functions, $J_{min} = 0$, where $J_{min}$ is the global minimum value of the cost function. **Figure 2** shows a plot of the two-dimensional Schwefel function, and **Figure 3** shows a plot of the two-dimensional Eggholder function.

#### **Table 1.**

*Optimization test functions and their solution spaces.*
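For reference, two of the harder test functions in **Table 1** can be written in a few lines. This sketch uses the standard, unshifted forms of Schwefel's and Eggholder's functions, whose minimum values are not zero; since the chapter states $J_{min} = 0$ for all 13 functions, its versions are presumably offset accordingly.

```python
import math

def schwefel(x):
    """Standard N-dimensional Schwefel function on [-500, 500]^N.

    Global minimum is approximately 0 at x_i = 420.9687 for all i.
    """
    return 418.9829 * len(x) - sum(xi * math.sin(math.sqrt(abs(xi))) for xi in x)

def eggholder(x1, x2):
    """Standard 2-D Eggholder function on [-512, 512]^2.

    Global minimum is approximately -959.6407 at (512, 404.2319).
    """
    return (-(x2 + 47.0) * math.sin(math.sqrt(abs(x1 / 2.0 + x2 + 47.0)))
            - x1 * math.sin(math.sqrt(abs(x1 - (x2 + 47.0)))))
```

Both functions are highly multimodal, with many deep local minima far from the global minimum, which is why they are standard stress tests for global optimizers.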

For all functions and dimensions, the initial population size, $K(0)$, was set to 50, $\overline{\eta} = 0.9$, $P_{DE1}(n) = 0.16$, $P_{DE2}(n) = 0.2$, $P_{Tc}(n) = 0.22$, and $P_{cr}(n)$ was set each iteration according to Eq. (18). The algorithm was applied 1000 times to each function and was assumed to have converged when a solution, $\mathbf{x}_{opt}$, was found such that $J(\mathbf{x}_{opt}) \le J_{tol}$, where $J_{tol}$ for each function is listed in **Table 2**. **Table 2** lists the means and standard deviations of both the number of iterations and the number of cost function evaluations that the algorithm required to converge with two-level ($L = 2$) and three-level ($L = 3$) Taguchi crossover for each function.
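The per-function statistics in **Table 2** follow from straightforward bookkeeping over repeated runs. The sketch below is illustrative only: `run_algorithm` is a hypothetical generator standing in for one run of the hybrid algorithm, yielding the current best solution each iteration, and the stopping test mirrors the criterion $J(\mathbf{x}_{opt}) \le J_{tol}$.

```python
import statistics

def convergence_stats(run_algorithm, j, j_tol, runs=1000):
    """Run the optimizer `runs` times and summarize iterations-to-convergence.

    `run_algorithm` is a hypothetical generator function: each call starts a
    fresh run that yields the current best solution x_opt once per iteration.
    A run is counted as converged at iteration n when j(x_opt) <= j_tol.
    """
    iteration_counts = []
    for _ in range(runs):
        for n, x_opt in enumerate(run_algorithm(), start=1):
            if j(x_opt) <= j_tol:
                iteration_counts.append(n)
                break
    return statistics.mean(iteration_counts), statistics.stdev(iteration_counts)
```

The same loop, with an added counter for cost function evaluations, produces the evaluation-count columns of **Table 2**.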

*A Hybrid Genetic, Differential Evolution Optimization Algorithm DOI: http://dx.doi.org/10.5772/intechopen.106204*

**Figure 2.** *A plot of the two-dimensional Schwefel's function.*

**Figure 3.** *A plot of the two-dimensional Eggholder's function.*

**Table 2.**

*Results for the optimization of the 2-D and 35-D test functions where $K(0) = 50$, $\overline{\eta} = 0.9$, $P_{DE1} = 0.16$, $P_{DE2} = 0.2$, and $P_{Tc} = 0.22$ for two-level ($L = 2$) and three-level ($L = 3$) Taguchi crossover. Results are averages over 1000 runs.*

The number of cost function evaluations that the algorithm requires to converge can also be calculated as a function of the algorithm's average population size, average mating pool size, and operator probabilities. For example, for the 2-D functions, a two-level Taguchi crossover requires four cost function evaluations and a three-level Taguchi crossover requires nine cost function evaluations. Therefore, the average number of cost function evaluations per algorithm iteration is

$$\overline{K} + 4 + 4P_{Tc}\left[\left(\overline{M} - 1\right)\left(1 + \overline{P}_3\right) + 4\right] \tag{27}$$

for two-level Taguchi crossover and


$$\overline{K} + 9 + 9P_{Tc}\left[\left(\overline{M} - 1\right)\left(1 + \overline{P}_3\right) + 4\right] \tag{28}$$

for three-level Taguchi crossover, where $\overline{K}$ is the average population size, $\overline{M}$ is the average mating pool size after selection, and $\overline{P}_3$ is the average of $P_3(n)$.

Similarly, for the 35-D functions, two-level Taguchi crossover requires 40 cost function evaluations, and three-level Taguchi crossover requires 81 cost function evaluations. Therefore, the average number of cost function evaluations per algorithm iteration is

$$\overline{K} + 40 + 40P_{Tc}\left[\left(\overline{M} - 1\right)\left(1 + \overline{P}_3\right) + 4\right] \tag{29}$$

for two-level Taguchi crossover and

$$\overline{K} + 81 + 81P_{Tc}\left[\left(\overline{M} - 1\right)\left(1 + \overline{P}_3\right) + 4\right] \tag{30}$$

for three-level Taguchi crossover.
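Eqs. (27)–(30) share one form and differ only in the number of evaluations per Taguchi crossover (4, 9, 40, or 81), so the two- and three-level variants can be compared directly with a small helper. The values used for $\overline{K}$, $\overline{M}$, and $\overline{P}_3$ below are illustrative assumptions, not the chapter's measured averages.

```python
def avg_evals_per_iteration(k_bar, m_bar, p3_bar, p_tc, c):
    """Average cost function evaluations per iteration, per Eqs. (27)-(30).

    c is the number of evaluations per Taguchi crossover:
    4 (2-D, L=2), 9 (2-D, L=3), 40 (35-D, L=2) or 81 (35-D, L=3).
    """
    return k_bar + c + c * p_tc * ((m_bar - 1.0) * (1.0 + p3_bar) + 4.0)

# Illustrative comparison with hypothetical averages K=50, M=25, P3=0.5:
two_level = avg_evals_per_iteration(50.0, 25.0, 0.5, 0.22, c=4)
three_level = avg_evals_per_iteration(50.0, 25.0, 0.5, 0.22, c=9)
```

With these illustrative inputs, the per-iteration cost of three-level crossover exceeds that of two-level crossover, which is consistent with the trade-off reported below: fewer iterations but more evaluations for $L = 3$.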

Although no algorithm can solve all types of optimization problems [8, 9], the data in **Table 2** show that the algorithm converged below the specified $J_{tol}$ in 100% of the 1000 runs for all of the test functions. The data in **Table 2** also show that the algorithm requires significantly more iterations to converge for Eggholder's function, $J_{13}$, and for Rosenbrock's valley, $J_9$, when $N = 35$, which implies that the algorithm has not been optimized for all types of cost functions. The data also show that although the three-level Taguchi crossover algorithm typically converges in fewer iterations than the two-level Taguchi crossover algorithm, the two-level algorithm typically requires fewer cost function evaluations.
