**4. Classification of Genetic Algorithms**

Sometimes the cost function is extremely complicated and time-consuming to evaluate, so some care must be taken to minimize the number of cost function evaluations. One idea is to execute several simple GAs in parallel; such algorithms are called Parallel Genetic Algorithms (PGAs). PGAs were developed to reduce the long execution times that simple genetic algorithms need to find near-optimal solutions in large search spaces. They have also been used to solve larger problems and to find better solutions, and they offer considerable gains in performance and scalability. Several PGA models (Independent PGA, Migration PGA, Partition PGA, Segmentation PGA) are fully described in (Sivanandam & Deepa, 2008).
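The Migration PGA model, for instance, evolves several subpopulations ("islands") independently and periodically exchanges their best individuals. A minimal sketch follows; the ring topology, operators, and parameter values are illustrative assumptions, not the specific schemes of (Sivanandam & Deepa, 2008):

```python
import random

def fitness(x):
    # Sphere function (minimization): optimum at x = [0, ..., 0].
    return sum(v * v for v in x)

def evolve(pop, mut_rate=0.1):
    # One generation of tournament selection, one-point crossover, mutation.
    new_pop = []
    for _ in range(len(pop)):
        a, b = random.sample(pop, 2)
        parent1 = a if fitness(a) < fitness(b) else b
        a, b = random.sample(pop, 2)
        parent2 = a if fitness(a) < fitness(b) else b
        cut = random.randrange(len(parent1))
        child = parent1[:cut] + parent2[cut:]
        child = [v + random.gauss(0, 0.1) if random.random() < mut_rate else v
                 for v in child]
        new_pop.append(child)
    return new_pop

def migration_pga(n_islands=4, pop_size=20, dim=5,
                  generations=50, migration_interval=10):
    islands = [[[random.uniform(-5, 5) for _ in range(dim)]
                for _ in range(pop_size)] for _ in range(n_islands)]
    for gen in range(generations):
        islands = [evolve(pop) for pop in islands]
        if (gen + 1) % migration_interval == 0:
            # Ring migration: each island sends its best individual to the
            # next island, where it replaces the worst individual.
            bests = [min(pop, key=fitness) for pop in islands]
            for i, pop in enumerate(islands):
                worst = max(pop, key=fitness)
                pop[pop.index(worst)] = bests[i - 1]
    return min((min(pop, key=fitness) for pop in islands), key=fitness)

best = migration_pga()
print(fitness(best))
```

Because the islands evolve independently between migrations, each generation of all islands can in principle be evaluated on a separate processor, which is the source of the performance gains mentioned above.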

Hybrid Genetic Algorithms (HGAs) form another important class of GAs. A hybrid GA combines the power of the GA with the speed of a local optimizer. The GA excels at gravitating toward the global minimum but is not especially fast at pinpointing it once inside a locally quadratic region. Thus the GA finds the region of the optimum, and the local optimizer then takes over to find the minimum itself. Some examples of HGAs used in Digital Electronics Design will be presented in the next section.
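This two-phase idea can be sketched as follows: a coarse GA locates the basin of the global minimum of a multimodal function, and a simple coordinate hill-climber (standing in for any local optimizer) refines the result. The test function, operators, and parameters are illustrative assumptions:

```python
import math
import random

def cost(x):
    # Rastrigin function: many local minima, global minimum at the origin.
    return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

def ga_search(pop_size=30, dim=2, generations=40):
    pop = [[random.uniform(-5.12, 5.12) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[:pop_size // 2]                       # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            child = [(a + b) / 2 for a, b in zip(p1, p2)]   # blend crossover
            child = [v + random.gauss(0, 0.3) for v in child]  # mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

def local_refine(x, step=0.1, iters=200):
    # Greedy coordinate hill-climb with a shrinking step size.
    best = list(x)
    for _ in range(iters):
        improved = False
        for i in range(len(best)):
            for delta in (step, -step):
                trial = list(best)
                trial[i] += delta
                if cost(trial) < cost(best):
                    best = trial
                    improved = True
        if not improved:
            step *= 0.5
            if step < 1e-6:
                break
    return best

coarse = ga_search()            # GA gravitates toward the global basin
refined = local_refine(coarse)  # the local optimizer finishes the job
```

Since the hill-climber only ever accepts improving moves, the refined solution is never worse than the GA's output, which is exactly the division of labour the HGA exploits.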

Adaptive Genetic Algorithms (AGAs) are GAs whose parameters, such as the population size, the crossover probability, or the mutation probability, are varied while the GA is running. "The mutation rate may be changed according to changes in the population; the longer the population does not improve, the higher the mutation rate is chosen. Vice versa, it is decreased again as soon as an improvement of the population occurs" (Sivanandam & Deepa, 2008).
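The quoted rule can be sketched on a simple binary problem: the mutation rate doubles for each stagnant generation and falls back to its base value as soon as the best fitness improves. The OneMax objective, operators, and rate constants are illustrative assumptions:

```python
import random

def fitness(bits):
    return sum(bits)  # OneMax: maximize the number of 1 bits

def adaptive_ga(pop_size=30, length=40, generations=60,
                base_rate=0.01, max_rate=0.2):
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    mut_rate = base_rate
    best_so_far = max(fitness(ind) for ind in pop)
    stagnant = 0
    rate_history = []
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            a, b = random.sample(pop, 2)                 # tournament selection
            parent1 = a if fitness(a) > fitness(b) else b
            a, b = random.sample(pop, 2)
            parent2 = a if fitness(a) > fitness(b) else b
            cut = random.randrange(length)
            child = parent1[:cut] + parent2[cut:]        # one-point crossover
            child = [1 - g if random.random() < mut_rate else g
                     for g in child]                     # bit-flip mutation
            new_pop.append(child)
        pop = new_pop
        best = max(fitness(ind) for ind in pop)
        if best > best_so_far:
            best_so_far = best
            stagnant = 0
            mut_rate = base_rate            # improvement: decrease again
        else:
            stagnant += 1
            # the longer the population does not improve, the higher the rate
            mut_rate = min(max_rate, base_rate * (2 ** stagnant))
        rate_history.append(mut_rate)
    return best_so_far, rate_history

best_fit, rates = adaptive_ga()
```

Raising the mutation rate during stagnation injects diversity to escape local optima, while lowering it after an improvement lets the population exploit the newly found region.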

The Fast Messy Genetic Algorithm (FmGA) is a binary, stochastic, variable-string-length, population-based approach to solving optimization problems. The main difference between the FmGA and other genetic approaches is its ability to explicitly manipulate building blocks (BBs) of genetic material in order to obtain good solutions and, potentially, the global optimum. Some works, such as (Haupt & Haupt, 2004), use only the term Messy Genetic Algorithms (mGAs).
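The variable-length "messy" encoding that makes this explicit building-block manipulation possible can be sketched as follows: a chromosome is a list of (locus, allele) pairs, under-specified loci are filled in from a competitive template, over-specified loci are resolved first-come-first-served, and recombination is cut-and-splice. The OneMax objective and the all-zero template are illustrative assumptions:

```python
import random

def fitness(bits):
    return sum(bits)  # OneMax as a stand-in objective

def express(messy, template):
    # Decode a messy chromosome against a fully specified template.
    bits = list(template)
    seen = set()
    for locus, allele in messy:
        if locus not in seen:       # first occurrence wins (overspecification)
            bits[locus] = allele
            seen.add(locus)
    return bits

def cut_and_splice(p1, p2):
    # The messy-GA recombination operator: cut both parents, splice pieces.
    c1 = random.randrange(1, len(p1)) if len(p1) > 1 else 0
    c2 = random.randrange(1, len(p2)) if len(p2) > 1 else 0
    return p1[:c1] + p2[c2:], p2[:c2] + p1[c1:]

length = 10
template = [0] * length             # competitive template
# Two partial building blocks (under-specified chromosomes):
bb1 = [(0, 1), (1, 1), (2, 1)]
bb2 = [(7, 1), (8, 1), (9, 1)]
child1, child2 = cut_and_splice(bb1, bb2)
f1 = fitness(express(child1, template))
f2 = fitness(express(child2, template))
```

Because genes carry their locus with them, tightly linked building blocks can be copied, cut, and juxtaposed as units, which fixed-position binary strings in a simple GA cannot do.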

Finally, Independent Sampling Genetic Algorithms (ISGAs) are more robust GAs that manipulate building blocks to avoid premature convergence. Implicit parallelism and the efficacy of crossover are enhanced, and ISGAs have been shown to outperform several other GAs (Sivanandam & Deepa, 2008). Other classes of efficient GAs may be implemented for specific applications.
