**5. Summary and conclusions**

This chapter presents a hybrid genetic/differential evolution algorithm that represents the set of parameters being optimized as a vector. The algorithm uses an elitist, rank-based random selection method to generate a mating pool. Candidate solutions from the mating pool are randomly selected for two differential evolution operators and two recombination operators, and the new candidate solutions generated by these operators are added to the mating pool. Candidate solutions from this expanded mating pool are then randomly selected for Taguchi crossover.
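One generation of the hybrid scheme described above can be sketched in Python. This is a minimal illustration, not the chapter's exact implementation: the DE/rand/1 mutation, arithmetic recombination, linear-ranking weights, the scale factor `F`, and the L8(2^7) orthogonal array used for the two-level Taguchi crossover are all illustrative assumptions, and only one application of each operator is shown per generation.

```python
import random

def sphere(x):
    # Simple cost function used here only to exercise the sketch.
    return sum(xi * xi for xi in x)

def l8_oa():
    # L8(2^7) two-level orthogonal array (levels coded 0/1, up to 7 factors).
    return [
        [0, 0, 0, 0, 0, 0, 0],
        [0, 0, 0, 1, 1, 1, 1],
        [0, 1, 1, 0, 0, 1, 1],
        [0, 1, 1, 1, 1, 0, 0],
        [1, 0, 1, 0, 1, 0, 1],
        [1, 0, 1, 1, 0, 1, 0],
        [1, 1, 0, 0, 1, 1, 0],
        [1, 1, 0, 1, 0, 0, 1],
    ]

def taguchi_crossover(p1, p2, cost):
    # Two-level Taguchi crossover: each gene is a factor; level 0 takes the
    # gene from p1, level 1 from p2. Evaluate the OA experiments, then build
    # a child from the better level of each factor (main-effect analysis).
    oa, d = l8_oa(), len(p1)
    assert d <= 7, "L8 array handles at most 7 factors"
    trials = []
    for row in oa:
        child = [p1[i] if row[i] == 0 else p2[i] for i in range(d)]
        trials.append((row, cost(child)))
    # Each level appears equally often per column, so level sums are comparable.
    return [p1[i]
            if sum(c for r, c in trials if r[i] == 0)
            <= sum(c for r, c in trials if r[i] == 1)
            else p2[i]
            for i in range(d)]

def hybrid_step(pop, cost, F=0.5):
    # Elitist rank-based random selection: keep the best candidate, then draw
    # the rest of the mating pool with linear-ranking probabilities.
    pop = sorted(pop, key=cost)
    n = len(pop)
    weights = [n - i for i in range(n)]
    pool = [pop[0]] + random.choices(pop, weights=weights, k=n - 1)

    # Differential-evolution operator (DE/rand/1): v = a + F * (b - c).
    a, b, c = random.sample(pool, 3)
    pool.append([ai + F * (bi - ci) for ai, bi, ci in zip(a, b, c)])

    # Recombination operator: arithmetic (blend) crossover of two parents.
    p, q = random.sample(pool, 2)
    pool.append([(pi + qi) / 2 for pi, qi in zip(p, q)])

    # Taguchi crossover applied to members of the expanded mating pool.
    p, q = random.sample(pool, 2)
    pool.append(taguchi_crossover(p, q, cost))

    # Survivor selection: truncate the expanded pool back to n candidates.
    return sorted(pool, key=cost)[:n]
```

Because the current best candidate is always carried into the pool and survivor selection is truncation by cost, the best cost is non-increasing across generations.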

To evaluate this algorithm's ability to solve optimization problems, it was applied to 13 commonly used global numerical optimization test functions: the sphere function, three hyper-ellipsoid functions, the sum of different powers, Rastrigin's, Schwefel's, Griewank's, Rosenbrock's valley, Styblinski-Tang, Ackley's path, Price-Rosenbrock, and Eggholder's functions. The algorithm was evaluated using both two-level and three-level Taguchi crossover. In both cases, the algorithm converged below the specified *J min* in 100% of the 1000 runs for every test function. Although the three-level Taguchi crossover variant typically converged in fewer iterations than the two-level variant, the two-level variant typically required fewer cost function evaluations.
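For reference, Rastrigin's function, one of the multimodal benchmarks listed above, has a standard closed form; a sketch of it in Python (the chapter's specific dimensions and search bounds are not reproduced here):

```python
import math

def rastrigin(x):
    # Rastrigin's function: f(x) = 10d + sum(x_i^2 - 10 cos(2*pi*x_i)).
    # Highly multimodal, with the global minimum f(0, ..., 0) = 0.
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi)
                             for xi in x)
```

A convergence criterion of the kind described above would then stop a run once the best cost falls below the chosen *J min* threshold for this function.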

Although this algorithm required significantly more iterations to converge for Eggholder's function and for the 35-D Rosenbrock's valley function, ref. [19] shows that it has been successfully used to design digital infinite impulse response (IIR) filters with arbitrary magnitude responses. As a result, the simple optimization algorithm described in this chapter can be expected to perform well on similar engineering optimization applications.
