**1. Introduction**

186 Bio-Inspired Computational Algorithms and Their Applications


Biometric technologies such as fingerprint, hand geometry, face and iris recognition are widely used to establish a person's identity. Face recognition is currently one of the most important biometric technologies; it identifies a person by comparing a newly acquired face image with a set of face templates pre-stored in a database.

Though the human perception system identifies faces with relative ease, face recognition using computer techniques is challenging and remains an active research field. Illumination and pose variations are currently the two main obstacles limiting the performance of face recognition systems. Various techniques have been proposed in recent years to overcome these limitations. For instance, three-dimensional face recognition has been investigated to address the illumination and pose variations simultaneously [Bowyer et al., 2004; S. Mdhani et al., 2006]. The illumination variation problem can also be mitigated by additional modalities such as infrared (IR) images [D. A. Socolinsky & A. Selinger, 2002].

Thermal face recognition has received little attention compared with recognition in the visible spectrum, partly because of the high cost of IR cameras. Recent technological advances, however, have made IR cameras practical for face recognition. While thermal face recognition systems are advantageous for detecting disguised faces, or when there is no control over illumination, recognizing faces in IR images is challenging because 1) it is difficult to segment faces from the background in low-resolution IR images, and 2) intensity values in IR images are not consistent, since different body temperatures produce different intensity values.

The overall goal of this research is to develop computational methods for efficiently obtaining improved images. The research objective will be accomplished by integrating enhanced visual images with IR images through the following steps: 1) enhance the optical images, 2) register the enhanced optical images with the IR images, and 3) fuse the optical and IR images with the help of a Genetic Algorithm.
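The fusion step can be illustrated with a toy sketch. This is *not* the chapter's actual algorithm: the chromosome here is a single pixel-weight `w`, the fitness is simply the contrast (variance) of the fused image, and all function names (`fuse`, `contrast`, `ga_fusion_weight`) are illustrative assumptions.

```python
import random

def fuse(optical, thermal, w):
    # Pixel-wise weighted fusion; w is the value the GA searches for.
    return [w * o + (1.0 - w) * t for o, t in zip(optical, thermal)]

def contrast(img):
    # Variance of pixel values, used here as a stand-in fitness measure.
    mean = sum(img) / len(img)
    return sum((p - mean) ** 2 for p in img) / len(img)

def ga_fusion_weight(optical, thermal, pop_size=20, generations=40, seed=1):
    """Evolve a single fusion weight w in [0, 1] that maximizes the
    contrast of the fused image (toy fitness; a real system would use
    an application-specific objective and chromosome encoding)."""
    rng = random.Random(seed)
    fitness = lambda w: contrast(fuse(optical, thermal, w))
    pop = [rng.random() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = 0.5 * (a + b)               # arithmetic crossover
            child += rng.gauss(0.0, 0.05)       # Gaussian mutation
            children.append(min(1.0, max(0.0, child)))
        pop = parents + children
    return max(pop, key=fitness)
```

With a flat thermal image and a high-contrast optical image, the evolved weight drifts toward 1, i.e. the GA learns to favor the informative modality.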

Section 2 surveys related work on IR imaging, image enhancement, image registration and image fusion. Section 3 discusses the proposed nonlinear image enhancement methods. Section 4 presents the proposed image fusion algorithm. Section 5 reports the experimental results of the proposed algorithm. Section 6 concludes this research.

Fusion of Visual and Thermal Images Using Genetic Algorithms 189

$$g'(l) = (L-1)\,\frac{\log\bigl(1+\alpha\, g(l)\bigr)}{\log\bigl(1+\alpha\,(L-1)\bigr)} \tag{2}$$

where α parameterizes the non-linear transfer function.

**2.3 Registration**

Image registration is a basic task in image processing used to align two or more images, usually referred to as a reference image and a sensed image [R. C. Gonzalez et al., 2004]. Registration is typically a required process in remote sensing [L. M. G. Fonseca & B. S. Manjunath, 1996], medicine and computer vision. Registration can be classified into four main categories according to the manner in which the images were obtained [B. Zitova & J. Flusser, 2003]:

• Different viewpoints: Images of the same scene taken from different viewpoints.
• Different times: Images of the same scene taken at different times.
• Different sensors: Images of the same scene taken by different sensors.
• Scene to model registration: Images of a scene taken by sensors and images of the same scene taken from a model (e.g., a digital elevation model).

It is impossible to implement a single comprehensive method applicable to all registration tasks, and many different registration algorithms exist. This research focuses on feature-based registration techniques, which usually consist of the following three steps [B. Zitova & J. Flusser, 2003]:

• Feature detection: This step locates a set of control points, such as edges, line intersections and corners, in the image. They can be detected manually or automatically.
• Feature matching: The second step establishes the correspondence between the features detected in the sensed image and those detected in the reference image.
• Transform model estimation, image resampling and geometric transformation: The sensed image is transformed and resampled to match the reference image by proper interpolation techniques [B. Zitova & J. Flusser, 2003].

Each registration step has its specific problems. In the first step, the features used for registration must be spread over the images and be easily detectable. The feature sets determined in the reference and sensed images must have enough common elements, even when the two images do not cover exactly the same scene. Ideally, the algorithm should be able to detect the same features in both images [B. Zitova & J. Flusser, 2003].

In the second step, feature matching, physically corresponding features can be dissimilar because of different imaging conditions and/or the different spectral sensitivities of the sensors. The choice of feature descriptors and of the similarity measure has to take these factors into account. The feature descriptors should be efficient and invariant to the assumed degradations, and the matching algorithm should be robust and efficient. Single features without corresponding counterparts in the other image should not affect its performance [B. Zitova & J. Flusser, 2003].

In the last step, the selection of an appropriate resampling technique is governed by the trade-off between interpolation accuracy and computational complexity.
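As a minimal sketch of the α-parameterized non-linear transfer function, assuming 8-bit gray levels and a logarithmic form (the function name and the exact formula are illustrative assumptions, not the chapter's definitive implementation):

```python
import math

def nonlinear_enhance(pixels, alpha, L=256):
    # g'(l) = (L-1) * log(1 + alpha*g(l)) / log(1 + alpha*(L-1));
    # alpha > 0 controls how strongly darker gray levels are boosted,
    # while the endpoints 0 and L-1 are left fixed.
    denom = math.log(1.0 + alpha * (L - 1))
    return [(L - 1) * math.log(1.0 + alpha * g) / denom for g in pixels]
```

For example, `nonlinear_enhance([0, 64, 255], 0.1)` keeps 0 and 255 unchanged while raising the mid-tone value, which is the qualitative behavior expected of such an enhancement curve.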
