*Continuous Schemes for Program Evolution*

**6. References**

[1] John R. Koza, Martin A. Keane, Matthew J. Streeter, William Mydlowec, Jessen Yu, and Guido Lanza. *Genetic Programming IV: Routine Human-Competitive Machine Intelligence*. Kluwer Academic Publishers, 2003.

[17] Christian B. Veenhuis. Tree based differential evolution. In *[32]*, pages 208–219, 2009.

[18] Alberto Moraglio and Sara Silva. Geometric differential evolution on the space of genetic programs. In Anna Isabel Esparcia-Alcázar, Aniko Ekart, Sara Silva, Stephen Dignum, and A. Sima Uyar, editors, *Proceedings of the 13th European Conference on Genetic Programming, EuroGP 2010*, volume 6021 of *LNCS*, pages 171–183, Istanbul, 7–9 April 2010. Springer. Best paper.

[19] Shumeet Baluja and Rich Caruana. Removing the genetics from the standard genetic algorithm. In *Proceedings of the 12th International Conference on Machine Learning*, pages 38–46. Morgan Kaufmann Publishers, 1995.

[20] A. Auger and N. Hansen. A restart CMA evolution strategy with increasing population size. In *Evolutionary Computation, 2005. The 2005 IEEE Congress on*, volume 2, pages 1769–1776, September 2005.

[21] K. Price. Differential evolution: a fast and simple numerical optimizer. In *Biennial Conference of the North American Fuzzy Information Processing Society*, pages 524–527, 1996.

[22] S. Rahnamayan and P. Dieras. Efficiency competition on n-queen problem: DE vs. CMA-ES. In *Electrical and Computer Engineering, 2008. CCECE 2008. Canadian Conference on*, pages 000033–000036, May 2008.

[23] Carmen G. Moles, Pedro Mendes, and Julio R. Banga. Parameter estimation in biochemical pathways: A comparison of global optimization methods. *Genome Research*, 13(11):2467–2474, 2003.

[24] N. Hansen and S. Kern. Evaluating the CMA evolution strategy on multimodal test functions. In X. Yao et al., editors, *Parallel Problem Solving from Nature – PPSN VIII*, volume 3242 of *LNCS*, pages 282–291. Springer, 2004.

[25] A. Auger and N. Hansen. A restart CMA evolution strategy with increasing population size. In *Evolutionary Computation, 2005. The 2005 IEEE Congress on*, volume 2, pages 1769–1776, September 2005.

[26] Nikolaus Hansen and Stefan Kern. Evaluating the CMA evolution strategy on multimodal functions. In *Parallel Problem Solving from Nature – PPSN VIII*, volume 3242 of *LNCS*, pages 282–291. Springer, 2004.

[27] Swagatam Das and Ponnuthurai Nagaratnam Suganthan. Differential evolution: A survey of the state-of-the-art. *IEEE Transactions on Evolutionary Computation*, 15(1):4–31, February 2011.

[28] Garnett Wilson and Wolfgang Banzhaf. A comparison of cartesian genetic programming and linear genetic programming. In Michael O'Neill, Leonardo Vanneschi, Steven Gustafson, Anna Esparcia Alcázar, Ivanoe De Falco, Antonio Della Cioppa, and Ernesto Tarantino, editors, *Genetic Programming*, volume 4971 of *Lecture Notes in Computer Science*, pages 182–193. Springer Berlin / Heidelberg, 2008.

[29] W. B. Langdon and R. Poli. Why ants are hard. Technical Report CSRP-98-4, University of Birmingham, School of Computer Science, January 1998. Presented at GP-98.

[30] William B. Langdon. *Genetic Programming and Data Structures = Automatic Programming!* Kluwer Academic Publishers, 1998.

[31] Maarten Keijzer, Andrea Tettamanzi, Pierre Collet, Jano van Hemert, and Marco Tomassini, editors. *8th European Conference, EuroGP 2005*, volume 3447 of *LNCS*, Lausanne, Switzerland, March 2005.


[2] Sameer H. Al-Sakran, John R. Koza, and Lee W. Jones. Automated re-invention of a previously patented optical lens system using genetic programming. In *[31]*, pages 25–37, 2005.

[3] Kun-Hong Liu and Chun-Gui Xu. A genetic programming-based approach to the classification of multiclass microarray datasets. *Bioinformatics*, 25(3):331–337, 2009.

[4] Adrian Gepp and Phil Stocks. A review of procedures to evolve quantum algorithms. *Genetic Programming and Evolvable Machines*, 10(2):181–228, 2009.

[5] M. Szymanski, H. Worn, and J. Fischer. Investigating the effect of pruning on the diversity and fitness of robot controllers based on MDL2E during genetic programming. In *[33]*, pages 2780–2787, 2009.

[6] Markus Brameier and Wolfgang Banzhaf. *Linear Genetic Programming*. Genetic and Evolutionary Computation. Springer, 2007.

[7] H. A. Abbass, N. X. Hoai, and R. I. McKay. AntTAG: A new method to compose computer programs using colonies of ants. In *The IEEE Congress on Evolutionary Computation*, pages 1654–1659, 2002.

[8] Y. Shan, H. Abbass, R. I. McKay, and D. Essam. AntTAG: a further study. In *Proceedings of the Sixth Australia-Japan Joint Workshop on Intelligent and Evolutionary Systems, Australian National University, Canberra, Australia*, volume 30, 2002.

[9] R. P. Salustowicz and J. Schmidhuber. Probabilistic incremental program evolution. *Evolutionary Computation*, 5(2):123–141, 1997.

[10] Evandro Nunes Regolin and Aurora Trinidad Ramirez Pozo. Bayesian automatic programming. In *[31]*, pages 38–49, 2005.

[11] Rainer Storn and Kenneth Price. Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces. *Journal of Global Optimization*, 11(4):341–359, 1997.

[12] Nikolaus Hansen and Andreas Ostermeier. Adapting arbitrary normal mutation distributions in evolution strategies: The covariance matrix adaptation. In *International Conference on Evolutionary Computation*, pages 312–317, 1996.

[13] Michael O'Neill and Anthony Brabazon. Grammatical differential evolution. In *International Conference on Artificial Intelligence (ICAI'06)*, pages 231–236, Las Vegas, Nevada, USA, 2006.

[14] Michael O'Neill and Conor Ryan. Grammatical evolution. *IEEE Transactions on Evolutionary Computation*, 5(4):349–357, August 2001.

[15] James E. Murphy, Michael O'Neill, and Hamish Carr. Exploring grammatical evolution for horse gait optimisation. In *[32]*, pages 183–194, 2009.

[16] Michael O'Neill and Conor Ryan. *Grammatical Evolution: Evolutionary Automatic Programming in an Arbitrary Language*. Kluwer Academic Publishers, 2003.

[32] Leonardo Vanneschi, Steven Gustafson, Alberto Moraglio, Ivanoe De Falco, and Marc Ebner, editors. *12th European Conference, EuroGP 2009*, volume 5481 of *LNCS*, Tübingen, Germany, April 2009.

**Author details**

Cyril Fonlupt, Denis Robilliard, Virginie Marion-Poty

*LISIC, ULCO, Univ Lille Nord de France, France*

[33] *IEEE Congress on Evolutionary Computation*, Trondheim, Norway, May 2009.

*Genetic Programming – New Approaches and Successful Applications*

**Chapter 3**

**Programming with Annotated Grammar Estimation**

Yoshihiko Hasegawa

Additional information is available at the end of the chapter

http://dx.doi.org/10.5772/51662

©2012 Hasegawa, licensee InTech. This is an open access chapter distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

**1. Introduction**

Evolutionary algorithms (EAs) mimic natural evolution to solve optimization problems. Because EAs do not require detailed assumptions about the problem at hand, they can be applied to many real-world problems. In EAs, solution candidates are evolved by genetic operators such as crossover and mutation, which are analogues of processes in natural evolution. In recent years, EAs have also been considered from the viewpoint of distribution estimation, with estimation of distribution algorithms (EDAs) attracting much attention [14]. Although the genetic operators of EAs are inspired by natural evolution, EAs can equally be viewed as algorithms that sample solution candidates from the distribution of promising solutions. Since this distribution is generally unknown, approximation schemes are applied to perform the sampling. Genetic algorithms (GAs) and genetic programming (GP) approximate the sampling by randomly changing promising solutions via the genetic operators (mutation and crossover). In contrast, EDAs assume that the distribution of promising solutions can be expressed by a parametric model, and they repeatedly learn the model and sample new candidates from it.

Although GA-type sampling (mutation or crossover) is easy to perform, it has the disadvantage of being valid only when structurally similar individuals have similar fitness values (e.g., in the one-max problem). GA and GP have shown poor search performance on deceptive problems [6], where this condition is not satisfied. EDAs, however, have been reported to show much better search performance on some problems that GA and GP do not handle well. As in GAs, EDAs usually employ fixed-length linear arrays to represent solution candidates (these EDAs are referred to as GA-EDAs in the present chapter). Over the past decade, EDAs have been extended to handle programs and functions having tree structures (we refer to these as GP-EDAs in the present chapter).

Because tree structures differ in their number of nodes, model learning is much more difficult than in GA-EDAs. From the viewpoint of modeling type, GP-EDAs can be broadly classified into two groups: probabilistic prototype tree (PPT) based methods and probabilistic context-free grammar (PCFG) based methods. PPT-based methods employ techniques devised in GA-EDAs by transforming variable-length tree structures into fixed-length linear arrays. PCFG-based methods employ
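The model-learning-and-sampling loop that distinguishes EDAs from GA-type sampling can be sketched with a minimal univariate marginal EDA on the one-max problem. This is an illustrative sketch only: the population size, truncation ratio, and generation count below are assumptions for demonstration, not values taken from this chapter.

```python
import random

def one_max(bits):
    # Fitness: number of ones in the bit string.
    return sum(bits)

def umda(n_bits=20, pop_size=100, n_select=50, generations=50, seed=0):
    """Minimal univariate EDA: learn per-bit marginal probabilities from
    the selected (promising) individuals, then sample the next
    population from the learnt model -- the learn/sample loop of EDAs."""
    rng = random.Random(seed)
    # Initial model: every bit is 1 with probability 0.5.
    probs = [0.5] * n_bits
    best = None
    for _ in range(generations):
        # Sampling: draw pop_size candidates from the current model.
        pop = [[1 if rng.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
        # Truncation selection: keep the most promising solutions.
        pop.sort(key=one_max, reverse=True)
        selected = pop[:n_select]
        if best is None or one_max(pop[0]) > one_max(best):
            best = pop[0]
        # Model learning: re-estimate each bit's marginal frequency.
        probs = [sum(ind[i] for ind in selected) / n_select
                 for i in range(n_bits)]
    return best

print(one_max(umda()))
```

Note how no crossover or mutation appears: new candidates come only from the learnt distribution, which is what lets EDAs cope with some problems where GA-type sampling is misled.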
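As an illustration of how a variable-shaped program tree can be mapped onto the fixed-length linear arrays that GA-EDA machinery expects, here is a hypothetical heap-style linearization over a full binary prototype tree. It is a simplified stand-in for the encodings used by actual PPT-based GP-EDAs (such as PIPE), not a faithful reproduction of any published scheme.

```python
# Sketch: a full binary "prototype tree" of fixed depth laid out as an
# array, where the children of position i sit at 2*i + 1 and 2*i + 2.
# A program tree of any shape up to that depth then occupies fixed
# positions, so per-position statistics can be learnt as in GA-EDAs.

MAX_DEPTH = 3                      # prototype-tree depth (assumption)
SIZE = 2 ** (MAX_DEPTH + 1) - 1    # 15 positions for depth 3

def tree_to_array(node, pos=0, arr=None):
    """node is (symbol, left, right) for a function node, or a plain
    string for a terminal; unused prototype positions stay None."""
    if arr is None:
        arr = [None] * SIZE
    if isinstance(node, tuple):
        symbol, left, right = node
        arr[pos] = symbol
        tree_to_array(left, 2 * pos + 1, arr)
        tree_to_array(right, 2 * pos + 2, arr)
    else:
        arr[pos] = node
    return arr

# The program (x + y) * x as a nested tuple:
program = ('*', ('+', 'x', 'y'), 'x')
print(tree_to_array(program))
```

The fixed array length is the price of this encoding: the prototype tree must be deep enough to hold the largest program considered, which is one reason model learning for tree structures is harder than in GA-EDAs.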
