#### c.Fourier Transform

This technique applies mainly to stationary periodic signals.

The Fourier transform is an integral transform that expresses a function in terms of sinusoidal basis functions, i.e. as the sum or integral of sinusoidal functions multiplied by coefficients. There are several directly related variations of this transform, depending on the type of function to be transformed. The Fourier transform decomposes a temporal function (a signal) into its constituent frequencies, just as a musical chord can be expressed as the amplitudes (or volumes) of its constituent notes. The Fourier transform of a temporal function is a complex-valued function of frequency whose absolute value represents the amount of each frequency present in the original function and whose complex argument is the phase offset of the basic sinusoid at that frequency.
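As a minimal sketch (using NumPy's FFT; the sampling rate and component frequencies are illustrative assumptions, not values from the text), the decomposition of a stationary periodic signal into its frequencies can be shown as:

```python
import numpy as np

# A stationary periodic test signal: 50 Hz and 120 Hz components
# (sampling rate and frequencies are illustrative assumptions).
fs = 1000                      # sampling rate in Hz
t = np.arange(0, 1, 1 / fs)    # 1 second of samples
x = 2.0 * np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# Discrete Fourier transform of the real signal: the magnitude of each
# complex coefficient gives the amplitude of that frequency, and its
# complex argument gives the phase offset of the sinusoid.
spectrum = np.fft.rfft(x)
freqs = np.fft.rfftfreq(len(x), d=1 / fs)
amplitude = 2.0 * np.abs(spectrum) / len(x)

# The two largest spectral lines recover the component frequencies.
peaks = sorted(float(f) for f in freqs[np.argsort(amplitude)[-2:]])
print(peaks)   # the two dominant frequencies, in Hz
```

The magnitudes `2.0` and `0.5` of the two sinusoids reappear as the heights of the corresponding spectral lines, which is exactly the amplitude decomposition described above.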

#### d.Wavelet Transform

This technique applies mainly to non-stationary periodic signals.

Many time series exhibit non-stationary behaviors, such as changing trends and structural breaks from the beginning to the end of the event. These features are often the most important parts of the signal, and by applying the Fourier transform it is not possible to capture these events efficiently.

The wavelet transform is a very useful tool for analyzing these non-stationary series.

The wavelet transform has attractive qualities that make it a very useful method for time series that exhibit characteristics varying in both time and frequency (or scale).

The wavelet transform allows the signal to be decomposed into a set of basis functions at different resolution levels and localization times. From these levels it is possible to reconstruct or represent a function by appropriately using the wavelet bases and the coefficients of these levels.
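A minimal sketch of this idea, implementing a single level of the Haar wavelet transform by hand (real applications would use a richer wavelet family via a library such as PyWavelets), shows how the detail coefficients localize a structural break in time:

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform: split an
    even-length signal into a coarse approximation (local averages)
    and a detail band (local differences) localized in time."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def haar_idwt(approx, detail):
    """Invert one Haar level, reconstructing the original samples."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

# A non-stationary signal: a structural break (step change) at sample 7.
signal = np.concatenate([np.zeros(7), np.ones(9)])
approx, detail = haar_dwt(signal)

# The detail coefficients vanish everywhere except at the break, so the
# transform localizes the change in time, whereas the Fourier transform
# would spread the same event across all frequencies.
print(np.nonzero(detail)[0])   # the pair index containing the break
```

The `approx`/`detail` split is one resolution level; applying `haar_dwt` again to `approx` yields the coarser levels from which the signal can be reconstructed, as described above.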

#### e.ARMA (p,q) model

This technique applies mainly to stochastic signals.

In the statistical analysis of time series, autoregressive moving average (ARMA) models provide a parsimonious description of a weakly stationary stochastic process in terms of two polynomials, one for the autoregression and one for the moving average.

Given a time series of data X, the ARMA model is a tool for understanding and perhaps predicting future values in this series. The model consists of two parts, an autoregressive (AR) part and a moving average (MA) part. The AR part involves regressing the variable on its own lagged, that is, past, values. The MA part involves modeling the error term as a linear combination of error terms that occur contemporaneously and at various times in the past.

The model is generally referred to as the ARMA (p,q) model, where *p* is the order of the autoregressive part and *q* is the order of the moving average part.

In signal processing, a time series is a collection of observations made sequentially over time. In linear regression models with cross-section data the order of the observations is irrelevant to the analysis; in time series the order of the data is fundamental. A very important feature of this type of data is that neighboring observations are dependent, and the interest lies in analyzing and modeling this dependency.

Within probability theory, a stochastic process is a family of random variables representing the evolution of a system of values over time. It is the probabilistic counterpart of a deterministic process. Instead of a process that has a single way of evolving, as in the solutions of ordinary differential equations, for example, in a stochastic process there is an indetermination: even if the initial condition is known, there are sometimes infinitely many directions in which the process can evolve.

In discrete time, as opposed to the continuous-time case, the stochastic process is a sequence of random variables, such as a Markov chain. The variables corresponding to the various times may be completely different, the only requirement being that these different values all lie in the same space, that is, in the codomain of the function. One possible approach is to model the random variables as random functions of one or more deterministic arguments, in most cases relative to the time parameter. Although the random values of a stochastic process at different times may seem to be independent random variables, in the most common situations they exhibit a complex statistical dependence.
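As an illustrative sketch (the coefficients and sample size are chosen arbitrarily), an ARMA(1,1) process can be simulated directly from its defining recursion, and the AR coefficient recovered from the decay of the autocorrelation function:

```python
import numpy as np

def simulate_arma11(phi, theta, n, seed=0):
    """Simulate X_t = phi * X_{t-1} + eps_t + theta * eps_{t-1}.
    The AR part regresses X on its own past value; the MA part is a
    linear combination of the current and past error terms."""
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(n)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + eps[t] + theta * eps[t - 1]
    return x

x = simulate_arma11(phi=0.7, theta=0.3, n=20000)

def acf(x, k):
    """Sample autocorrelation of x at lag k."""
    x = x - x.mean()
    return float(np.dot(x[:-k], x[k:]) / np.dot(x, x))

# For a stationary ARMA(1,1), rho(k) = phi * rho(k-1) for k >= 2,
# so the ratio rho(2)/rho(1) recovers the AR coefficient phi.
phi_hat = acf(x, 2) / acf(x, 1)
print(round(phi_hat, 2))   # should be close to the true phi = 0.7
```

In practice one would estimate both orders and coefficients with a dedicated library (e.g. the ARIMA tools in statsmodels) rather than this hand-rolled moment estimate; the sketch only illustrates the AR and MA roles described above.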

#### *Real-Time Fault Detection and Diagnosis Using Intelligent Monitoring and Supervision Systems DOI: http://dx.doi.org/10.5772/intechopen.90158*


*Fault Detection, Diagnosis and Prognosis*

#### Kalman filter

The Kalman filter estimates the state of a dynamic system in two steps: projecting the current state estimate forward in time to obtain a predicted value, and calculating a weighted average between the predicted value and the measured value, with the highest weight given to the value with the least uncertainty. The estimates generated by the method tend to be closer to the actual values than the original measurements, since the weighted average has a better estimated uncertainty than either of the values used in its calculation. From a theoretical point of view, the Kalman filter is an algorithm for efficiently making accurate inferences about a linear dynamic system, which is a Bayesian model similar to a hidden Markov model, but where the state space of the unobserved variables is continuous and all observed and unobserved variables have a normal distribution (or often a multivariate normal distribution).

#### *2.3.2.2 Diagnosis*


At this stage, the type and degree of variation, the type of failure, its location, its severity and its impact on the performance of an element or component are determined, and, most particularly, the root cause of the disturbance is identified [8].

In order to achieve the objectives of the fault diagnosis stage, two very important techniques allow the monitoring and supervision system to achieve this intelligent and autonomous decision-making capability [5, 7, 14]. The application of these two methodologies is the major difference between classical monitoring and supervision and intelligent monitoring and supervision. These two methodologies are described below:

#### 1.Fault detection methods

These methods detect faults, locate them and determine their degree of severity. The methodologies used in this technique to detect faults are:

#### a. Fault detection with limit checking

Limit checking is a simple and frequently used method that detects faults by checking the limits of a directly measured variable Y(t). The measured variables of a process are monitored and checked to see whether their absolute values or trends exceed a threshold. An additional possibility is to check their plausibility.

To detect failures in a device of a production system, it is necessary to establish or determine variation limits for the variables considered of interest. Usually these limits are the maximum values these variables can reach. When the values of the variables reach these limits, it can be inferred that a variable is presenting abnormal changes. Whether these changes are continuous or discrete, it can be concluded that the device is entering a failure process.
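A minimal sketch of such limit checking (the thresholds `Y_MAX` and `TREND_MAX` are illustrative assumptions, not values from the text) might look like:

```python
import numpy as np

# Assumed limits for a monitored variable Y(t); values are illustrative.
Y_MAX = 10.0        # absolute-value threshold
TREND_MAX = 0.8     # maximum allowed change between consecutive samples

def check_limits(y):
    """Flag samples whose absolute value or local trend exceeds its limit."""
    y = np.asarray(y, dtype=float)
    over_abs = np.abs(y) > Y_MAX
    # Trend as the sample-to-sample change (first sample has zero trend).
    trend = np.abs(np.diff(y, prepend=y[0]))
    over_trend = trend > TREND_MAX
    return over_abs | over_trend

y = np.array([9.0, 9.2, 9.3, 10.5, 9.4])   # one excursion above Y_MAX
faults = check_limits(y)
print(np.nonzero(faults)[0])   # sample 3 violates both checks; sample 4 the trend check
```

Sample 3 is flagged because 10.5 exceeds the absolute limit and because the jump from 9.3 is larger than the trend limit; sample 4 is flagged by the trend check alone, because of the abrupt drop back.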

The fault detection threshold verification technique is based on two procedures in order to achieve its goal, namely:


The binary decision between "normal" and "disturbance" is often artificial, because there is rarely a sharp boundary between these two states. Therefore, the fuzzy threshold procedure is a more realistic alternative for detecting changes in the behavior of variables.
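A fuzzy threshold can be sketched as a membership function that maps a measurement to a gradual fault degree in [0, 1] instead of a binary decision (the breakpoints and the linear transition are illustrative assumptions):

```python
def fault_degree(y, normal=10.0, fault=12.0):
    """Fuzzy threshold: map a measurement to a fault membership in [0, 1].
    Values at or below `normal` are fully normal, values at or above
    `fault` are fully faulty, with a linear transition in between
    (both breakpoints are illustrative assumptions)."""
    if y <= normal:
        return 0.0
    if y >= fault:
        return 1.0
    return (y - normal) / (fault - normal)

# Instead of flipping from "normal" to "disturbance" at one hard limit,
# the indicator grows smoothly as the variable drifts into the fault zone.
print(fault_degree(9.5), fault_degree(11.0), fault_degree(12.5))  # → 0.0 0.5 1.0
```

A supervision system can then accumulate or trend this degree over time, raising an alarm only when the membership stays high, which is more robust than a single crisp threshold crossing.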

#### 2.Fault diagnosis methods—root cause identification

These methods identify the impact a failure has on the performance of an element or device, which is strongly related to the severity of the failure. The important part of the diagnostic step is that the system is able to identify the root cause of a problem.

Many measured signals show oscillations that are either harmonic or stochastic in nature, or both [9]. If changes to these signals are related to actuator, process or sensor failures, signal-model-based fault detection methods may be applied.

Assuming special mathematical models for the measured signals, appropriate characteristics can be determined. Comparison with the characteristics observed for normal behavior yields changes in these characteristics, which are considered analytical symptoms.

The signal model can be divided into non-parametric models, such as frequency spectra or correlation functions, and parametric models, such as amplitudes at distinct frequencies or ARMA models.
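As a sketch of the non-parametric case (the sampling rate, the characteristic frequency, and the 3x decision rule are illustrative assumptions), a spectral amplitude can serve as a characteristic whose change relative to normal behavior is treated as an analytical symptom:

```python
import numpy as np

def band_amplitude(x, fs, f0):
    """Non-parametric signal model: amplitude of the spectral line
    nearest a characteristic frequency f0 (e.g. a machine tone)."""
    spec = 2.0 * np.abs(np.fft.rfft(x)) / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    return float(spec[np.argmin(np.abs(freqs - f0))])

fs, f0 = 1000, 50                          # assumed sampling rate / frequency
t = np.arange(0, 1, 1 / fs)
normal = 0.2 * np.sin(2 * np.pi * f0 * t)  # reference signal (normal behavior)
faulty = 1.5 * np.sin(2 * np.pi * f0 * t)  # same tone, strongly amplified

# Symptom: the characteristic amplitude deviates from its normal value.
reference = band_amplitude(normal, fs, f0)
symptom = band_amplitude(faulty, fs, f0) - reference
print(symptom > 3 * reference)   # change large enough to flag as a symptom
```

The same comparison could be made with parametric characteristics, for example the estimated coefficients of an ARMA model fitted to the normal and the observed signal.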

At this stage, the analysis usually focuses on the following types of signals, which occur frequently in different production processes:

