**1. Introduction**

226 Real-Time Systems, Architecture, Scheduling, and Application

Object detection, or more generally pattern detection and recognition, can be based on many different principles. The objects can be described through their structure, shape, color, texture, etc. [Blaschko & Lampert (2009); Chen et al. (2004); Fidler & Leonardis (2007); Leibe et al. (2008); Lowe (1999); Serre et al. (2005); Viola & Jones (2001)]; consequently, a variety of object detection mechanisms has been developed over time. One of the modern approaches is similarity-based detection, where the objects of interest are defined through a set of examples (and typically also a set of counter-examples), and the decision whether an image region is an object of interest is made by a machine-learned functional block – a classifier. Object detection in an image is then performed by applying the classifier to sub-windows of the image.
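The scanning procedure described above can be sketched as follows. This is only an illustration: the `classify` function here is a hypothetical stand-in (a simple intensity threshold), not one of the trained classifiers discussed in this chapter.

```python
# Sliding-window detection sketch: a binary classifier is applied to every
# sub-window of the image. `classify` is a hypothetical stand-in classifier.

def classify(window):
    # Stand-in for a trained binary classifier: flags sub-windows whose
    # mean intensity exceeds a fixed threshold.
    mean = sum(sum(row) for row in window) / (len(window) * len(window[0]))
    return mean > 128

def scan(image, win_h, win_w, step=1):
    """Return top-left corners (x, y) of sub-windows the classifier accepts."""
    h, w = len(image), len(image[0])
    detections = []
    for y in range(0, h - win_h + 1, step):
        for x in range(0, w - win_w + 1, step):
            window = [row[x:x + win_w] for row in image[y:y + win_h]]
            if classify(window):
                detections.append((x, y))
    return detections

# 4x4 image with a bright 2x2 patch in the top-left corner
image = [[200, 200, 0, 0],
         [200, 200, 0, 0],
         [0,   0,   0, 0],
         [0,   0,   0, 0]]
print(scan(image, 2, 2))  # → [(0, 0)]: the bright patch is detected
```

In a real detector the inner loop dominates the runtime, which is why the rest of the chapter is concerned with fast classifiers and fast features.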

The focus of this chapter is on statistical binary classifiers whose function is to make a binary decision on whether an image region is or is not an object of interest. The methods of interest include mainly AdaBoost [Freund (1995); Schapire et al. (1998)], whose original purpose was to fuse a small number of relatively well-performing so-called *weak hypotheses* into one better-performing *strong classifier*. This approach was later extended so that, instead of a small number of weak classifiers, a large pool of simple functions is considered and suitable weak classifiers are selected from it automatically. This method was demonstrated in the pioneering work of Viola and Jones [Viola & Jones (2001)].
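The AdaBoost idea can be illustrated by a minimal discrete-AdaBoost sketch; the decision-stump weak classifiers and all names here are illustrative, not taken from the cited papers.

```python
# Minimal discrete AdaBoost: greedily pick the weak classifier with the
# lowest weighted error, weight it by alpha, and re-weight the samples so
# that misclassified ones gain importance.
import math

def train_adaboost(samples, labels, stumps, rounds):
    """samples: feature vectors; labels: +1/-1;
    stumps: candidate weak classifiers, each a function x -> +1/-1."""
    n = len(samples)
    w = [1.0 / n] * n                       # sample weights
    strong = []                             # list of (alpha, stump)
    for _ in range(rounds):
        # select the weak classifier with the lowest weighted error
        best, best_err = None, float("inf")
        for h in stumps:
            err = sum(wi for wi, x, y in zip(w, samples, labels) if h(x) != y)
            if err < best_err:
                best, best_err = h, err
        best_err = max(best_err, 1e-10)     # avoid division by zero
        alpha = 0.5 * math.log((1 - best_err) / best_err)
        strong.append((alpha, best))
        # re-weight: misclassified samples gain weight, then normalize
        w = [wi * math.exp(-alpha * y * best(x))
             for wi, x, y in zip(w, samples, labels)]
        s = sum(w)
        w = [wi / s for wi in w]
    return strong

def strong_classify(strong, x):
    return 1 if sum(a * h(x) for a, h in strong) >= 0 else -1

# Toy 1-D problem separable by a threshold stump
samples = [(0,), (1,), (2,), (3,)]
labels  = [-1, -1, 1, 1]
stumps  = [lambda x, t=t: 1 if x[0] > t else -1 for t in range(3)]
strong  = train_adaboost(samples, labels, stumps, rounds=3)
print([strong_classify(strong, x) for x in samples])  # → [-1, -1, 1, 1]
```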

The AdaBoost approach has been further refined and modified [Bourdev & Brandt (2005); Li et al. (2002); Sochman & Matas (2004; 2005)]. Perhaps the most important modification is WaldBoost [Sochman & Matas (2005)], which combines Wald's sequential decision making [Wald (1947)] with AdaBoost. The main advantage of WaldBoost is a significant performance gain compared to AdaBoost classifiers, with virtually no loss of classification quality.
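The sequential decision making behind WaldBoost can be sketched as follows. This is a hedged illustration of the evaluation phase only: the per-stage rejection thresholds `theta` are assumed to have been set by the learning procedure, and, as is common in detection, only the rejection branch of Wald's sequential test is used (acceptance is decided after the last stage).

```python
# WaldBoost-style sequential evaluation sketch: after each weak classifier
# the running score is compared with a per-stage rejection threshold; most
# background windows are rejected after evaluating only a few stages.

def waldboost_classify(stages, x):
    """stages: list of (alpha, h, theta) triples.
    Returns (decision +1/-1, number of stages evaluated)."""
    score = 0.0
    for i, (alpha, h, theta) in enumerate(stages, start=1):
        score += alpha * h(x)
        if score < theta:                  # early rejection (Wald's test)
            return -1, i
    return (1 if score >= 0 else -1), len(stages)

# Two illustrative stages with rejection threshold -0.5 each
stages = [(1.0, lambda x: 1 if x > 0 else -1, -0.5),
          (1.0, lambda x: 1 if x > 1 else -1, -0.5)]
print(waldboost_classify(stages, -1))  # → (-1, 1): rejected after one stage
print(waldboost_classify(stages, 2))   # → (1, 2): all stages evaluated
```

The early exits are exactly where the performance gain over plain AdaBoost evaluation comes from.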

The detection through classification involves the application of the classifier to a selection of sub-images of the analyzed image. As the classification results of neighboring sub-images may be statistically significantly interdependent, it is worth studying whether these inter-dependencies can be exploited to reduce the computational effort – through the prediction of classifier results in certain sub-images, through suppression of unwanted object detections (e.g. multiple detections in very close image locations), or simply through the sharing of intermediate results of the calculations. These aspects of object detection are addressed in this chapter as well.

Note that similarity-based detection evaluates each sub-window independently of scene context. This is in contrast with, for example, geometry-based approaches, where it cannot be expected that an airplane would be detected below walking people in the image.

The structure of the chapter is as follows. The next section gives a brief introduction to object detection with classifiers. Section 3 discusses properties of features extracted from an image and describes feature types often used for rapid object detection. Section 4 describes the ideas behind the AdaBoost and WaldBoost learning procedures. Acceleration methods for WaldBoost-based detection are introduced in Section 5. Implementation of the detection runtime on different platforms is discussed in Section 6. Some results of the detection acceleration are presented in Section 7, and finally we conclude in Section 8 with some ideas for future research.

**2. Object detection with classifiers**

Fig. 1. Scanning the image with a classifier. Individual sub-images of the image are classified by a classifier (Image source: BioID dataset).

As is obvious from the above description, detection of objects through AdaBoost/WaldBoost methods depends on object orientation and size; however, in many applications it is desirable to detect objects regardless of their size or orientation. While this requirement is difficult or often impossible to handle directly in the AdaBoost/WaldBoost machine learning process, a feasible approach is to handle it indirectly by repeating the detection process for different scales and/or orientations. The main reason is that, in general, the feature extraction methods (weak classifiers) are not rotation, scale or shift invariant. Therefore, the detection process should be applied repeatedly to *sample* the rotation, scale, etc. in the needed range. The density of this sampling depends on the tolerance of the classifier to rotation, scale, etc.; this tolerance is in general not predictable and depends on the dataset.

**3. Efficient feature extraction**

The performance of object detection is to a large part influenced by the underlying feature extraction methods. Features extracted from an image have two main properties: *a)* descriptive power and *b)* computational complexity. The goal in rapid object detection is to use computationally simple and, at the same time, descriptive features. In the vast majority of cases these two properties are mutually exclusive, so there are computationally simple features with low descriptive power (e.g. isolated pixels, sums of area intensity) and complex, hard-to-compute features with high descriptive power (Gabor wavelets [Lee (1996)], HoG [Dalal & Triggs (2005)], SIFT and SURF [Bay et al. (2008); Lowe (2004)], etc.). A close-to-ideal approach is that of Viola and Jones [Viola & Jones (2001)] with their Haar features, calculated in constant time from an integral representation of the image. The features used in this chapter are Local Binary Patterns (LBP) [Zhang et al. (2007)], Local Rank Patterns (LRP) [Hradiš et al. (2008)] and Local Rank Differences (LRD) [Zemcik et al. (2007)]. Their main properties are as follows.

• *Strict locality* – Evaluation is based strictly on local data (i.e. no normalization is needed).
• *Simple evaluation* – The input is coefficients extracted from an image by convolution with a rectangular kernel. The coefficients are processed by a simple formula.

All presented features are based on the same model; the only difference is their evaluation function. First, coefficients *v_i* from a regular 3 × 3 grid (see Fig. 2) are extracted by convolution. The coefficients are then processed by an evaluation function producing the feature response.

Fig. 2. Feature samples for LBP (left), LRD and LRP (right)
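As a rough illustration of such evaluation functions, the following sketch computes an LBP code and a local rank difference on a 3 × 3 grid of coefficients. The exact definitions in the cited papers differ in details (e.g. neighbour ordering, choice of compared members, rank normalization), so this is only an approximation.

```python
# Illustrative evaluation of LBP and a rank-difference feature on a 3x3 grid
# of convolution coefficients v[0]..v[8] (row-major order, v[4] = centre).

def lbp(v):
    """8-bit Local Binary Pattern: each neighbour >= centre sets one bit."""
    centre = v[4]
    neighbours = [v[i] for i in (0, 1, 2, 5, 8, 7, 6, 3)]  # clockwise order
    code = 0
    for n in neighbours:
        code = (code << 1) | (1 if n >= centre else 0)
    return code

def rank(v, i):
    """Rank of coefficient i = number of grid members smaller than it."""
    return sum(1 for x in v if x < v[i])

def lrd(v, a, b):
    """Rank difference of two chosen grid members a and b."""
    return rank(v, a) - rank(v, b)

v = [10, 20, 30, 40, 50, 60, 70, 80, 90]
print(lbp(v))        # → 30: only neighbours v[5], v[8], v[7], v[6] set bits
print(lrd(v, 8, 0))  # → 8: ranks 8 and 0
```

Note that, matching the *strict locality* property above, both functions use only the nine local coefficients and need no image-wide normalization.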
