**8. Conclusion**


Fig. 16. Smile Stage Classification Recognition Rate Based on the Maximum Value Selection of Kernel Linear Preserving Projection Method Using 1st Scenario (x-axis: Number of Dimensions; y-axis: Classification Rate (%); curves: Angular Separation, Canberra)

Fig. 17. Smile Stage Classification Recognition Rate Based on the Maximum Value Selection of Kernel Linear Preserving Projection Method Using 2nd Scenario (same axes and curves)

Fig. 18. Smile Stage Classification Recognition Rate Based on the Maximum Value Selection of Kernel Linear Preserving Projection Method Using 3rd Scenario (same axes and curves)

Both the maximum non-linear feature selection of Kernel Principal Component Analysis and Kernel Linear Preserving Projection yield local feature structures for extraction, which are more important than global structures in feature space. It has been shown that the maximum non-linear feature selection of Kernel Principal Component Analysis for face recognition outperforms PCA, LDA/QR and LPP/QR on the ORL and YALE face databases, whereas the maximum value selection of Kernel Linear Preserving Projection, as an extension of Kernel Principal Component Analysis, outperforms 2D-PCA+SVM and PCA+LDA+SVM for smile stage classification.
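To make the kernel-based nonlinear feature extraction discussed above concrete, the sketch below implements generic kernel PCA with an RBF kernel in NumPy. This is only an illustration of the underlying technique, not the chapter's maximum-value selection method; the function name `kernel_pca` and the `gamma` parameter are our own choices.

```python
import numpy as np

def kernel_pca(X, n_components, gamma=1.0):
    """Generic kernel PCA sketch (RBF kernel), NOT the chapter's method:
    build the Gram matrix, center it in feature space, and project the
    training points onto the leading kernel principal components."""
    # Pairwise squared distances -> RBF Gram matrix K(i, j) = exp(-gamma*|xi-xj|^2)
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    # Center the Gram matrix in feature space
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecompose and keep the leading eigenpairs
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    return Kc @ alphas  # nonlinear feature coordinates of the training points

# Usage: two concentric rings, a classic case where linear PCA fails
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 200)
r = np.concatenate([np.ones(100), 3 * np.ones(100)])
X = np.c_[r * np.cos(t), r * np.sin(t)] + 0.05 * rng.normal(size=(200, 2))
Z = kernel_pca(X, n_components=2, gamma=0.5)
print(Z.shape)
```

Because the eigenvectors are normalized by the square root of their eigenvalues, the extracted feature columns come out mutually orthogonal over the training set.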

**9**

**FPGA Implementation for GHA-Based Texture Classification**

Shiow-Jyu Lin<sup>1,2</sup>, Kun-Hung Lin<sup>1</sup> and Wen-Jyi Hwang<sup>1</sup>

<sup>1</sup>*Department of Computer Science and Information Engineering, National Taiwan Normal University* and <sup>2</sup>*Department of Electronic Engineering, National Ilan University, Taiwan*

**1. Introduction**

Principal components analysis (PCA) (Alpaydin, 2010; Jolliffe, 2002) is an effective unsupervised feature extraction algorithm for pattern recognition, classification, computer vision and data compression (Bravo et al., 2010; Zhang et al., 2006; Kim et al., 2005; Liying & Weiwei, 2009; Pavan et al., 2007; Qian & James, 2008). The goal of PCA is to obtain a compact and accurate representation of the data that reduces or eliminates statistically redundant components. Basic approaches to PCA involve the computation of the covariance matrix and the extraction of its eigenvalues and eigenvectors. A drawback of these basic approaches is their high computational complexity and large memory requirement for data with high vector dimension. They may therefore not be well suited for real-time applications requiring fast feature extraction.

A number of fast algorithms (Dogaru et al., 2004; El-Bakry, 2006; Gunter et al., 2007; Sajid et al., 2008; Sharma & Paliwal, 2007) have been proposed to reduce the computation time of PCA. However, only moderate acceleration can be achieved because most of these algorithms are implemented in software. Although hardware implementations of PCA and its variants are possible, they usually require large storage and complicated circuit control management. Hardware implementation of PCA may therefore be feasible only for small dimensions (Boonkumklao et al., 2001; Chen & Han, 2009).

An alternative for PCA implementation is the generalized Hebbian algorithm (GHA) (Haykin, 2009; Oja, 1982; Sanger, 1989). The principal component computation in GHA is based on an effective incremental updating scheme that reduces memory utilization. Nevertheless, slow convergence of the GHA (Karhunen & Joutsensalo, 1995) is usually observed. A large number of iterations is therefore required, resulting in long computation time for many GHA-based algorithms. Hardware implementation of GHA has been found effective for reducing the computation time. However, since the number of multipliers in the circuit grows with the vector dimension, such circuits may be suitable only for PCA with small dimensions. Although analog GHA hardware architectures (Carvajal et al., 2007; 2009) can lift the constraints on the vector dimensions, these architectures are difficult to use directly in digital devices.
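The incremental updating scheme at the heart of GHA is Sanger's rule: with weights W and response y = Wx, each sample updates W by a Hebbian term minus a lower-triangular decorrelation term. A minimal NumPy sketch of this rule follows; the function name `gha_fit` and the learning-rate/epoch settings are illustrative choices, not taken from the chapter.

```python
import numpy as np

def gha_fit(X, n_components, lr=0.01, epochs=200, seed=0):
    """Estimate the leading principal components of zero-mean data X
    (samples x features) with Sanger's generalized Hebbian rule:
    dW = lr * (y x^T - LT(y y^T) W), where LT keeps the lower triangle."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.1, size=(n_components, X.shape[1]))
    for _ in range(epochs):
        for x in X:
            y = W @ x  # responses of the component neurons
            # Hebbian growth minus Gram-Schmidt-like decorrelation
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

# Usage: check the first GHA row against the dominant covariance eigenvector
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5)) @ np.diag([3.0, 2.0, 1.0, 0.5, 0.1])
X -= X.mean(axis=0)
W = gha_fit(X, n_components=2)
_, eigvecs = np.linalg.eigh(np.cov(X.T))
pc1 = eigvecs[:, -1]  # eigenvector of the largest eigenvalue
print(abs(W[0] @ pc1) / np.linalg.norm(W[0]))  # |cosine|, close to 1 at convergence
```

Note that no covariance matrix is ever stored: each sample touches only W and the current x, which is exactly the memory advantage that makes GHA attractive for hardware implementation; the slow convergence mentioned above shows up here as the many epochs needed.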
