**1**

**Recent Development of Genetic Algorithm and Its Applications**

**The Successive Zooming Genetic Algorithm**

Young-Doo Kwon1 and Dae-Suep Lee2

*1School of Mechanical Engineering & IEDT, Kyungpook National University, Republic of Korea*

*2Division of Mechanical Engineering, Yeungjin College, Daegu, Republic of Korea*

**1. Introduction**

Optimization techniques range widely, from the early gradient techniques [1] to the latest random techniques [16, 18, 19], including ant colony optimization [13, 17]. Gradient techniques are very powerful when applied to smooth, well-behaved objective functions, especially a monotonic function with a single optimum. They encounter difficulties, however, in problems with multiple optima and in those having a sharp gradient, such as a problem with a constraint or a jump. The solution may converge to a local optimum, or not converge to any optimum but diverge near a jump.

To remedy these difficulties, several techniques based on random searching have been developed: full random methods, simulated annealing methods, and genetic algorithms. Full random methods like the Monte Carlo method are perfectly global but exhibit very slow convergence. Simulated annealing methods are modified versions of the hill-climbing technique; they have enhanced global search ability, but they too have slow convergence rates.

Genetic algorithms [2-5] have good global search ability with a relatively fast convergence rate. The global search ability stems from the crossover and mutation of the chromosomes in the reproduced pool; the fast convergence stems from selection that takes fitness into account through the roulette or tournament operation. The micro-GA [3] does not need to adopt mutation, for it introduces into the mating pool completely new individuals that have no relation to the evolved, similar individuals. Its pool size is smaller than that used by the simple GA, which needs a big pool to generate a variety of individuals.

Versatile genetic algorithms have some difficulty in identifying the optimal solution correct to several significant digits. They can quickly approach the vicinity of the global optimum but thereafter, in many cases, march toward it too slowly. To enhance the convergence rate, hybrid methods have been developed. A typical one first obtains a rough optimum using the GA and then approaches the exact optimum using a gradient method. Another finds the rough optimum using the GA first and then searches for the exact optimum by applying the GA again in a local domain selected according to certain logic [7].

The SZGA (Successive Zooming Genetic Algorithm) [6, 8-12] zooms the search domain for a specified number of steps to obtain the optimal solution. The tentative optimum solutions
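The selection, crossover, and mutation operators described in the introduction can be sketched as a minimal real-coded GA for a one-dimensional maximization problem. The operator choices here (tournament selection, arithmetic crossover, Gaussian mutation, one-elite survival) and all parameter values are illustrative assumptions, not the authors' implementation:

```python
import math
import random

def objective(x):
    """Illustrative multimodal test function on [0, 10]; the global
    maximum of x*sin(x) on this interval lies near x = 7.98."""
    return x * math.sin(x)

def simple_ga(f, lo, hi, pop_size=40, generations=80,
              crossover_rate=0.9, mutation_rate=0.2, sigma=0.3):
    """Minimal real-coded GA: tournament selection, arithmetic
    crossover, Gaussian mutation, and one-elite survival."""
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        best = max(pop, key=f)            # elitism: carry the best over
        children = [best]
        while len(children) < pop_size:
            # Tournament selection: the fitter of two random individuals
            p1 = max(random.sample(pop, 2), key=f)
            p2 = max(random.sample(pop, 2), key=f)
            # Arithmetic crossover blends the two parents
            if random.random() < crossover_rate:
                a = random.random()
                child = a * p1 + (1.0 - a) * p2
            else:
                child = p1
            # Gaussian mutation keeps the search global
            if random.random() < mutation_rate:
                child += random.gauss(0.0, sigma)
            children.append(min(max(child, lo), hi))   # clamp to domain
        pop = children
    return max(pop, key=f)
```

As the text notes, a plain GA like this reaches the neighborhood of the global optimum quickly but refines the last few significant digits slowly, which is what motivates the hybrid and zooming variants.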

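The successive-zooming idea can be sketched as repeatedly running a small GA and shrinking the search interval around the current best point. The fixed zoom factor, stage count, and the tiny inner GA below are illustrative assumptions for a one-dimensional problem, not the SZGA's actual zooming logic:

```python
import math
import random

def objective(x):
    # Illustrative multimodal test function; the global maximum of
    # x*sin(x) on [0, 10] lies near x = 7.98
    return x * math.sin(x)

def tiny_ga(f, lo, hi, pop_size=30, generations=40):
    """Very small real-coded GA used inside each zooming stage:
    tournament selection plus Gaussian mutation scaled to the domain."""
    span = hi - lo
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        nxt = [max(pop, key=f)]                  # keep the elite
        while len(nxt) < pop_size:
            parent = max(random.sample(pop, 3), key=f)
            child = parent + random.gauss(0.0, 0.1 * span)
            nxt.append(min(max(child, lo), hi))  # clamp to stage domain
        pop = nxt
    return max(pop, key=f)

def zooming_search(f, lo, hi, stages=5, zoom=0.3):
    """Run the GA, then shrink the search interval around the current
    best point by the zoom factor, and repeat for a fixed number of
    stages, as in the successive-zooming scheme."""
    best = None
    for _ in range(stages):
        best = tiny_ga(f, lo, hi)
        half = 0.5 * zoom * (hi - lo)            # new half-width
        lo, hi = best - half, best + half        # zoom in on the best
    return best
```

Because each stage multiplies the interval width by the zoom factor, the attainable precision improves geometrically with the number of stages, while the inner GA's mutation scale shrinks along with the domain.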