**4. Industrial quality control**

Quality control in industrial applications is used to monitor and to guarantee the quality of the processes. In this field two main elements play an important role: the sensors used to capture data, such as signals or images, and the adopted computational intelligence techniques (Piuri & Scotti, 2005). Quality monitoring includes the use of signal measurements or machine vision systems in order to allow a standardized and non-invasive control of industrial production processes. The computational intelligence techniques comprise the formalization of the mechanism which allows the extraction of useful information from the images and its interpretation for the purposes of the system it is designed for; therefore they may also include components such as neural networks, fuzzy systems and evolutionary computation algorithms. A generic quality control system needs to manage techniques belonging to several scientific areas, as depicted in Fig. 2.

In the following, a brief explanation of all the blocks included in Fig. 2 is provided.

Fig. 2. Generic scheme of a quality control system

**4.1 Data acquisition**

Data acquisition is a typical problem concerning measurement systems. Many studies demonstrate how computational intelligence techniques can improve the performance of the acquisition process.

**4.2 Data pre-processing**

The main aim of signal pre-processing is to reduce the noise and to make use of the inherent information provided by the signals. Many conventional pre-processing techniques have been proposed in the literature (Proakis & Manolakis, 1996; Rabiner & Gold, 1975), including computational intelligence techniques; in this context a good survey of neural and fuzzy approaches to signal pre-processing is due to Widrow and Stearns (Widrow & Stearns, 1985).
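A minimal, stdlib-only sketch of such a conventional pre-processing step is a centred moving-average filter, which attenuates noise by averaging each sample with its neighbours; the function name and window size here are illustrative assumptions, not taken from the cited literature:

```python
def moving_average(signal, window=3):
    """Smooth a 1-D signal with a centred moving-average window.

    Near the borders the window shrinks so that only existing
    samples contribute to the average.
    """
    half = window // 2
    smoothed = []
    for i in range(len(signal)):
        lo = max(0, i - half)
        hi = min(len(signal), i + half + 1)
        neighbourhood = signal[lo:hi]
        smoothed.append(sum(neighbourhood) / len(neighbourhood))
    return smoothed

# An isolated spike in an otherwise flat signal is strongly attenuated.
noisy = [0.0, 0.1, 0.0, 5.0, 0.1, 0.0, 0.1]
print(moving_average(noisy))
```

Widening the window strengthens the smoothing at the cost of blurring genuine signal transitions, which is the usual trade-off of low-pass pre-processing.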

If the captured data consist of an image, the pre-processing phase is used to correct image acquisition errors and non-ideal conditions of the source image. In every system implementing machine vision functionalities, a pre-processing phase is recommended in order to correct such acquisition errors or to improve the image characteristics for the subsequent visual inspection.

Image pre-processing is a phase which, through several operations, improves the image by suppressing undesired distortions or by enhancing the features relevant to the further analysis tasks. Note that image pre-processing does not add information content to the image (Haralick & Shapiro, 1992; Hlavac et al., 1998) but exploits its redundancy, based on the observation that in a real object neighbouring pixels correspond to similar brightness values. A distorted pixel can therefore be removed from the image and reinserted with a value equal to the average of its neighbouring pixels.
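The neighbourhood-average repair just described can be sketched as follows; this is an illustrative fragment (the function name and the use of the 8-neighbourhood are assumptions), not the chapter's own implementation:

```python
def repair_pixel(image, row, col):
    """Replace image[row][col] with the average of its 8-neighbours.

    `image` is a list of lists of grey values; a new image is returned,
    so the input image is left untouched.
    """
    h, w = len(image), len(image[0])
    neighbours = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue  # skip the distorted pixel itself
            r, c = row + dr, col + dc
            if 0 <= r < h and 0 <= c < w:
                neighbours.append(image[r][c])
    repaired = [list(rw) for rw in image]
    repaired[row][col] = sum(neighbours) / len(neighbours)
    return repaired

img = [[10, 10, 10],
       [10, 99, 10],   # the centre pixel is distorted
       [10, 10, 10]]
print(repair_pixel(img, 1, 1)[1][1])   # -> 10.0
```

At the image borders the neighbourhood simply shrinks, so corner and edge pixels are repaired from the three or five neighbours that actually exist.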

The main operations included in the image pre-processing phase are summarized as follows:


*Cropping* is introduced to remove some parts of the image in order to highlight the regions of interest.

*Image filtering* exploits a small neighbourhood of a pixel belonging to the input image in order to provide a new brightness value in the output image.

*Smoothing* techniques are used to reduce noise or possible fluctuations occurring in the image. To accomplish this task, the high frequencies in the Fourier transform domain are suppressed.

*Brightness threshold* is a fundamental operation to extract pertinent information. It consists of a grey-scale transformation whose result is a binary image. This approach is based on segmentation and separates the objects from their background.

*Edge Detection* is a very important step in image pre-processing. Edges are pixels lying where the image intensity changes sharply. The edge detection method is treated in more detail in a previous paragraph; for instance, to detect shapes from edge information, the *Generalized Hough Transform* (GHT) approach can be used (Ballard, 1981), and GHT can be implemented based on the discrete representation given by tabular functions.
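As a worked example of the brightness-threshold operation above, the following illustrative sketch maps a grey-scale image to the binary image described; the threshold value is an arbitrary assumption:

```python
def threshold(image, level):
    """Grey-scale to binary: 1 where the pixel exceeds `level`, else 0."""
    return [[1 if px > level else 0 for px in row] for row in image]

grey = [[ 12, 200,  35],
        [180,  90, 240]]
# With a mid-range threshold the bright pixels become the "object" mask.
print(threshold(grey, 128))   # -> [[0, 1, 0], [1, 0, 1]]
```

In practice the threshold level is often chosen automatically from the image histogram rather than fixed by hand.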

**4.3 Features extraction and selection**

With the previous operations, all the features to be processed have been fixed. Through feature extraction and selection the initial data can be reduced in order to diminish the computational complexity of the system. Moreover, a reduction of the number of features simplifies both the pattern representation and the classifier structure; finally, it mitigates the problem of the "curse of dimensionality" (Raudys & Jain, 1991). The so-called curse of dimensionality consists in the fact that the number of instances required per feature grows exponentially with the number of features itself; therefore, in order to reduce the complexity of the computational intelligence modules under training, it is fundamental to limit the number of features to consider. Both feature extraction and feature selection are used for the reduction of the feature space. The main difference between the two approaches is that feature extraction generates new features based on transformations or combinations of the original features, while feature selection selects the best subset of the original feature set (Dalton, 1996).

**4.4 Data fusion**

This operation combines the available features in order to obtain more significant information concerning the quality of the industrial process under consideration. A widely used technique is the so-called *sensor fusion*, which combines information of different types coming from several sensors. Many papers concerning the use of intelligent techniques for data fusion have been proposed, such as (Bloch, 1996; Filippidis et al., 2000; Xia et al., 2002; Benediktsson et al., 1997). Data fusion systems can be composed of several elements, such as sensors, data-fusion nodes, data-fusion databases and expert knowledge databases.

**4.5 Classification**

Once the features are fixed, they are fed as input to a classifier, which outputs either a value associated with the classification of the quality (an integer value) or a quality index (a real value).

The classification can be divided into two approaches: conventional classification and computational intelligence-based classification. The computational intelligence-based approach includes the statistical approach (Fukunaga, 1972), neural networks (Haykin, 1999) and fuzzy systems (Bezdek, 1992). This last issue will be treated in the next section.

**4.6 System optimization**

The modules belonging to the quality control system contain parameters which need to be fixed in order to improve the final accuracy, the computational complexity, the maximum possible throughput and the memory exploitation. These parameters include, for instance, thresholds, filter coefficients and, in the case of neural networks, the number of hidden neurons.

In order to build a satisfactory quality control system it is important to integrate all the above-cited activities. In order to obtain more accurate, adaptive and better-performing systems, the use of computational intelligence techniques is recommended.

**5. Fuzzy classifier**

Fuzzy Logic was introduced by Zadeh (Zadeh, 1965) and is based on the concept of "partial truth", i.e. truth values between "absolutely true" and "absolutely false". Fuzzy Logic provides a structure to model uncertainty, the human way of reasoning and the perception process. It is based on natural language, and through a set of rules an inference system is built which is the basis of the fuzzy computation. Fuzzy logic has many advantages: firstly, it is essential and applicable to many systems; moreover, it is easy to understand and highly flexible; finally, it is able to model non-linear functions of arbitrary complexity. The Fuzzy Inference System (FIS) is one of the main concepts of fuzzy logic, and its general scheme is shown in Fig. 3.

Fig. 3. FIS scheme

A FIS is a way of mapping input data to output data by exploiting the fuzzy logic concepts. Fuzzification is used to convert the system inputs, which are represented by crisp numbers, into fuzzy sets through a fuzzification function. The fuzzy rules are expressed in *if-then* form, and the set of these fuzzy rules provides the rule base for the fuzzy logic system. Moreover, the inference engine simulates the human reasoning process: through a suitable composition procedure, all the fuzzy subsets corresponding to each output variable are combined together in order to obtain a single fuzzy set for each output variable. Finally, the defuzzification operation is used to convert the fuzzy set coming from the inference engine into a crisp value (Abraham, 2005).

Fuzzy classification is an application of fuzzy theory. In fuzzy classification an instance can belong to different classes with different membership degrees; conventionally, the sum of the membership values of each single instance must be unitary. The main advantage of a fuzzy classification-based method is its applicability to very complex processes.

**6. Exemplar industrial applications**