$$Y\_t = X\_t - X\_{t-1}$$

The differenced data will contain one less point than the original data. Although you can difference the data more than once, one difference is usually sufficient.

b) If the data contain a trend, we can fit some type of curve to the data and then model the residuals from that fit.

c) For non-constant variance, taking the logarithm or square root of the series may stabilize the variance. For negative data, you can add a suitable constant to make all the data positive before applying the transformation. This constant can then be subtracted from the model to obtain predicted (i.e., the fitted) values and forecasts for future points.
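The differencing and variance-stabilizing transformations above can be sketched in a few lines of Python; the helper names (`difference`, `log_stabilize`) and the shift-by-a-constant rule are illustrative, not prescribed by the text:

```python
import math

def difference(series, d=1):
    """Difference a series d times; each pass drops one data point."""
    for _ in range(d):
        series = [b - a for a, b in zip(series, series[1:])]
    return series

def log_stabilize(series):
    """Log-transform a series, first shifting by a constant if any value is non-positive."""
    shift = 0.0
    if min(series) <= 0:
        shift = 1.0 - min(series)   # smallest shifted value becomes 1
    return [math.log(x + shift) for x in series], shift

# A series with a linear trend: one difference removes it.
trend = [2.0 * t + 5.0 for t in range(6)]
print(difference(trend))   # [2.0, 2.0, 2.0, 2.0, 2.0]
```

As the text notes, the differenced output is one point shorter than the input, and the shift constant must be remembered so it can be undone on the fitted values and forecasts.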

#### *White noise*

According to Cochrane (2005), the building block for our time series models is the white noise process, which I'll denote *εt*. In the simplest case,

$$\varepsilon\_t \sim \text{i.i.d. } N(0, \sigma\_\varepsilon^2)$$

Notice three implications of this assumption:

$$\text{1. } E(\varepsilon\_t) = E(\varepsilon\_t \mid \varepsilon\_{t-1}, \varepsilon\_{t-2}...) = E(\varepsilon\_t \mid \text{all information at } t-1) = 0$$

$$\text{2. } E\left(\varepsilon\_t \varepsilon\_{t-j}\right) = \text{cov}(\varepsilon\_t, \varepsilon\_{t-j}) = 0, \quad j \neq 0$$

$$\text{3. } \text{var}(\varepsilon\_t) = \text{var}(\varepsilon\_t \mid \varepsilon\_{t-1}, \varepsilon\_{t-2}...) = \text{var}(\varepsilon\_t \mid \text{all information at } t-1) = \sigma^2$$

The first and second properties are the absence of any serial correlation or predictability. The third property is conditional homoscedasticity, or a constant conditional variance. Later, we will generalize the building block process. For example, we may assume properties 2 and 3 without normality, in which case the *εt* need not be independent. We may also assume the first property only, in which case *εt* is a martingale difference sequence (Cochrane, 2005).
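The three properties can be checked empirically on simulated Gaussian white noise; a minimal Python sketch (sample size and seed are arbitrary choices):

```python
import random
import statistics

# Simulate Gaussian white noise and check the three properties empirically:
# mean zero, no serial correlation, constant variance sigma^2.
random.seed(0)
sigma = 1.0
eps = [random.gauss(0.0, sigma) for _ in range(20000)]

mean = statistics.fmean(eps)                                    # property 1: ~ 0
lag1 = statistics.fmean([a * b for a, b in zip(eps, eps[1:])])  # property 2: ~ 0
var = statistics.fmean([e * e for e in eps])                    # property 3: ~ sigma^2
print(round(mean, 2), round(lag1, 2), round(var, 2))
```

All three sample moments land close to their theoretical values, within sampling error of order 1/√n.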

*Summary of time series models:*

#### *Autoregressive models - AR(p)*

The class of purely autoregressive models is defined by:

$$Y\_t = \frac{a\_t}{\varphi\_p(B)}\tag{15}$$


Applications of Control Charts Arima for Autocorrelated Data

http://dx.doi.org/10.5772/50990



where *φp*(*B*) has p coefficients. The AR(p) model assumes that the current value is the weighted sum of its p past values plus white noise.

The stationarity condition of the AR(p) states that all p roots of the characteristic equation fall outside the unit circle (Russo et al., 2006).
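A minimal Python sketch of an AR(1) process, the simplest AR(p) case (the simulator and the parameter value 0.7 are illustrative):

```python
import random

def simulate_ar1(phi, n, seed=1):
    """Y_t = phi * Y_{t-1} + a_t, with a_t Gaussian white noise."""
    rng = random.Random(seed)
    y, prev = [], 0.0
    for _ in range(n):
        prev = phi * prev + rng.gauss(0.0, 1.0)
        y.append(prev)
    return y

def lag1_corr(y):
    mean = sum(y) / len(y)
    num = sum((a - mean) * (b - mean) for a, b in zip(y, y[1:]))
    den = sum((v - mean) ** 2 for v in y)
    return num / den

# For a stationary AR(1) (|phi| < 1) the lag-1 autocorrelation approaches phi.
y = simulate_ar1(0.7, 20000)
print(round(lag1_corr(y), 1))   # 0.7
```

With |φ| < 1 the single root of the characteristic equation, 1/φ, lies outside the unit circle, which is the stationarity condition stated above.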

#### *Moving average models - MA(q)*

According to Russo et al. (2009), the class of moving average models is defined by

$$Y\_t = \theta\_q(B)a\_t\tag{16}$$

where *θq*(*B*) has q coefficients. The MA(q) models result from a linear combination of the random shocks that occurred during the current and past periods.

The invertibility condition requires that all roots of the characteristic equation fall outside the unit circle.
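An MA(1) process can be sketched in Python to show the defining behavior of this class, an autocorrelation that cuts off after lag q (the simulator and θ = 0.5 are illustrative):

```python
import random

def simulate_ma1(theta, n, seed=2):
    """Y_t = a_t + theta * a_{t-1}: a combination of current and past shocks."""
    rng = random.Random(seed)
    a_prev = rng.gauss(0.0, 1.0)
    y = []
    for _ in range(n):
        a = rng.gauss(0.0, 1.0)
        y.append(a + theta * a_prev)
        a_prev = a
    return y

def acf(y, k):
    mean = sum(y) / len(y)
    num = sum((y[t] - mean) * (y[t + k] - mean) for t in range(len(y) - k))
    den = sum((v - mean) ** 2 for v in y)
    return num / den

# Theory for MA(1): rho(1) = theta / (1 + theta^2); rho(k) = 0 for k > 1.
y = simulate_ma1(0.5, 20000)
print(round(acf(y, 1), 1))   # 0.4, close to the theoretical 0.5 / 1.25
```

The sample lag-2 autocorrelation of the same series is near zero, as the MA(1) theory predicts.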

#### *Autoregressive and moving average models - ARMA (p,q)*

The class of autoregressive-moving-average models is of the type

$$Y\_t = \frac{\theta\_q(B)a\_t}{\varphi\_p(B)}\tag{17}$$

where *φp*(*B*) has p coefficients and *θq*(*B*) has q coefficients. By combining the AR(p) and MA(q) models, the ARMA(p,q) models are expected to be extremely parsimonious, using few coefficients to explain the same series.

From the standpoint of fitting this is very important, because the adjustment can be made more quickly. The stationarity and invertibility conditions of an ARMA(p,q) require that all p roots of *φp*(*B*) = 0 and all q roots of *θq*(*B*) = 0 fall outside the unit circle (Russo et al., 2009).
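For the ARMA(1,1) special case, the root conditions reduce to simple coefficient bounds; a minimal sketch (the function name and example values are illustrative):

```python
# Stationarity and invertibility check for an ARMA(1,1) model
# Y_t = phi1 * Y_{t-1} + a_t + theta1 * a_{t-1}. The root of
# phi(B) = 1 - phi1*B is 1/phi1 and the root of theta(B) = 1 + theta1*B
# is -1/theta1; both must lie outside the unit circle.
def arma11_conditions(phi1, theta1):
    stationary = abs(phi1) < 1    # |1/phi1| > 1
    invertible = abs(theta1) < 1  # |-1/theta1| > 1
    return stationary, invertible

print(arma11_conditions(0.7, 0.4))   # (True, True)
print(arma11_conditions(1.2, 0.4))   # (False, True): explosive AR part
```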

#### *Autoregressive Integrated Moving Average Models - ARIMA (p,d,q)*

The class of autoregressive-integrated-moving-average models is defined by the equation

$$Y\_t = \frac{\theta\_q(B)a\_t}{\varphi\_p(B)(1-B)^d}\tag{18}$$

for a positive integer *d*. Once the series has been differenced the *d* times necessary to make it stationary, the ARIMA(p,d,q) model can be fitted through the ARMA(p,q) model (Russo et al., 2009).
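The simplest case is ARIMA(0,1,0), a random walk: one difference (d = 1) recovers the white-noise shocks, after which an ARMA model could be fitted. A minimal Python sketch:

```python
import random

# A random walk is ARIMA(0,1,0): non-stationary, but one difference (d = 1)
# recovers the underlying white-noise shocks.
rng = random.Random(3)
shocks = [rng.gauss(0.0, 1.0) for _ in range(5000)]
walk, level = [], 0.0
for a in shocks:
    level += a
    walk.append(level)

diffed = [b - a for a, b in zip(walk, walk[1:])]
print(all(abs(d - s) < 1e-9 for d, s in zip(diffed, shocks[1:])))   # True
```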

#### *Seasonal Model - SARIMA*


Practical Concepts of Quality Control


According to Fischer (1982), the appearance of some short-term cyclical behavior is called seasonality. For a full treatment of a time series, it is necessary to characterize and eliminate this cyclic function of time in order to reach the condition of stationarity.

Seasonality means a tendency of the variable to repeat a certain behavior with some regularity in time. That is, seasonal series are those whose variations in one period of time are similar to those in another, characterized by high serial correlation between observations spaced by the seasonal period, in addition, of course, to the serial correlation between neighboring observations.

Similar to the ARIMA(p,d,q) process, this process develops the model in one of three basic forms of describing each value of *Yt*, and applies the same procedures developed for a model where the seasonal component is not present. After establishing the value of the variable in period t+h, the expectation operator is applied. Forecast errors, confidence intervals, and updating are treated similarly to the ARIMA model (Fischer, 1982).
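The seasonal part of a SARIMA model rests on seasonal differencing, *Yt* − *Yt−s*; a minimal Python sketch with hypothetical monthly data (s = 12, values illustrative):

```python
# Seasonal differencing Y_t - Y_{t-s} removes a pattern that repeats with
# period s; this is the seasonal "D" step of a SARIMA model.
def seasonal_difference(series, s):
    return [series[t] - series[t - s] for t in range(s, len(series))]

pattern = [10, 12, 15, 11, 9, 8, 7, 9, 13, 16, 14, 11]    # one 12-month cycle
series = [v + 2 * cycle for cycle in range(4) for v in pattern]
print(set(seasonal_difference(series, 12)))   # {2}: the seasonal pattern is gone
```

What remains after seasonal differencing is the year-over-year change, which here is a constant because the toy series repeats the same cycle plus a level shift.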

#### *Box-Jenkins Methodology*

This prediction method is based on fitting tentative ARIMA models; it is a flexible modeling methodology in which forecasts are made from the current and past values of the series. It therefore describes both stationary and non-stationary behavior. ARIMA models are able to describe the generating process of a variety of series for forecasters (corresponding to the filters) without taking into account, for example, the economic relations that generated the series (Morretin and Toloi, 2006).

The determination of the best model in the Box-Jenkins methodology follows these steps (Leroy, 2006):

#### *Identification*

Identification is the most critical phase of the Box-Jenkins methodology; it is possible for different researchers to identify different models for the same series, using different selection criteria (ACF, PACF, Akaike, etc.). Typically, the models should be parsimonious. The analysis studies the ACF and PACF and attempts to identify the model. The process seeks to determine the order (p, d, q) based on the behavior of the autocorrelation function (ACF) and the partial autocorrelation function (PACF), as well as their respective correlograms.
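A minimal Python sketch of the identification step: compute the sample ACF of a simulated MA(1) series and look for spikes outside an approximate 95% significance band (the simulated series and the ±2/√n band are illustrative choices):

```python
import random

def sample_acf(y, max_lag):
    """Sample autocorrelations r_1 .. r_max_lag."""
    n = len(y)
    mean = sum(y) / n
    den = sum((v - mean) ** 2 for v in y)
    return [sum((y[t] - mean) * (y[t + k] - mean) for t in range(n - k)) / den
            for k in range(1, max_lag + 1)]

# For an MA(1) series the ACF should show one large spike at lag 1 and
# near-zero values afterwards, suggesting the order q = 1.
rng = random.Random(4)
a = [rng.gauss(0.0, 1.0) for _ in range(20001)]
y = [a[t] + 0.6 * a[t - 1] for t in range(1, len(a))]

band = 2 / len(y) ** 0.5   # approximate 95% significance band
spikes = [k + 1 for k, r in enumerate(sample_acf(y, 5)) if abs(r) > band]
print(spikes)
```

Lag 1 always appears among the spikes for this series, which is the correlogram signature a practitioner would read as q = 1.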

**5. Methodology and Results**

In this work we analyzed the Têxtil Oeste Ltda industry, whose Statistical Process Control implementation happened in 1999. Here we limit the analysis to control charts for continuous variables as the tools used for control of the process. The conventional Shewhart control charts were used, together with other appropriate models that transform autocorrelated data into data that are independent and normally distributed.

In the polypropylene thread process there are several outputs considered critical. One of these outputs is the thread's resistance. In an effort to develop a control plan to assure appropriate surface quality, it was determined that the resistance has a major impact on the surface quality of the thread. So, to verify the quality of the thread, its resistance should be controlled.

The data used in this study are the daily data of the polypropylene thread's resistance control. These data are used for model identification and estimation and for analyzing the models' predictive capacity. Before control charts are applied, three fundamental assumptions must be met: the process is under control; the data are normally distributed; and the observations are independent.

Montgomery (2009) considers that out-of-control points are flagged reasonably well by the Shewhart control charts when the normality assumption is somewhat violated, but when observations are not independent, control charts yield deceiving results. Many processes do not produce independent observations. Alwan (1991) describes a method for control charting with autocorrelated data. The method involves fitting a time series curve and control charting the residuals.

A study was made to verify where the largest instability of the process lies, so that we can better control the system. It is suspected that the daily thread resistance data are not independent, and a plot of these data, shown in Figure 4, supports this belief. The problem is to implement statistical control for a process that has autocorrelation (Dobson, 1995). Figure 4 shows the great variability of the data. Calculations were done to confirm the suspected autocorrelation. The autocorrelation coefficient for the thread's resistance is defined as

$$r\_k = \frac{\sum\_{t=1}^{n-k}(x\_t - \bar{x})(x\_{t+k} - \bar{x})}{\sum\_{t=1}^{n}(x\_t - \bar{x})^2}, \quad k = 0, 1, 2, \ldots$$

where *k* = time periods ahead and *n* = total number of data points.
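The autocorrelation coefficient *rk* can be computed directly; a minimal Python sketch on a small hypothetical resistance series (the values are illustrative, not the study's data):

```python
# Sample autocorrelation coefficient r_k of a series x at lag k.
def autocorr(x, k):
    n = len(x)
    xbar = sum(x) / n
    num = sum((x[t] - xbar) * (x[t + k] - xbar) for t in range(n - k))
    den = sum((v - xbar) ** 2 for v in x)
    return num / den

resistance = [30.1, 30.4, 30.6, 30.3, 29.9, 29.7, 29.8, 30.2, 30.5, 30.7]
print(round(autocorr(resistance, 1), 3))   # 0.543
```

A lag-1 value this far from zero on such a short series is the kind of evidence that would support the suspicion of autocorrelated data.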

#### *Estimation:*

After identifying the best model, it should then be fitted and examined. The fitted models are compared using several criteria. One of these criteria is parsimony: incorporating additional coefficients improves the degree of fit (it increases the R² and reduces the sum of squared residuals) but reduces the degrees of freedom of the model. One way to improve the degree of fit of a model to time series data is to include additional lags in the AR(p), MA(q), ARMA(p,q), and ARIMA cases.

The inclusion of additional lags implies increasing the number of regressors, which leads to a reduction in the estimated sum of squared residuals. Currently, there are several model-selection criteria that generate a trade-off between reductions in the estimated sum of squared residuals and a more parsimonious model.

Generally, when working with lagged variables, observations of the time series under study are lost. Therefore, to compare alternative (or competing) models, the number of observations used should remain fixed across all models compared.
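One such trade-off criterion is the Akaike information criterion; a minimal Python sketch with hypothetical residual sums of squares (the RSS values are invented for illustration):

```python
import math

# AIC for a least-squares fit: n*ln(RSS/n) + 2k. Extra lags lower the RSS
# but pay a 2k penalty, so the lowest AIC picks the parsimonious winner.
def aic(rss, n, k):
    return n * math.log(rss / n) + 2 * k

n = 100
rss_by_lags = {1: 52.0, 2: 40.0, 3: 39.5, 4: 39.4}   # hypothetical fits
best = min(rss_by_lags, key=lambda k: aic(rss_by_lags[k], n, k))
print(best)   # 2: further lags do not justify their penalty
```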

#### *Checking:*

To assess the efficacy of the model found, a residual analysis takes place. If the residuals are autocorrelated, then the dynamics of the series is not completely explained by the coefficients of the fitted model. Models with this feature should be excluded from the selection process.

The analysis of the existence (or not) of serial autocorrelation in the residuals is based on the autocorrelation and partial autocorrelation functions of the residuals and their respective correlograms. It is noteworthy that, when estimating a model, the errors it produces should have the "white noise" characteristic, that is, they should be independent and identically distributed (the i.i.d. condition).
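A common numerical companion to the residual correlogram is the Ljung-Box Q statistic (not named in the text above, but standard for this check); a Python sketch comparing white-noise residuals against autocorrelated ones:

```python
import random

# Ljung-Box Q statistic: Q = n(n+2) * sum_{k=1..h} r_k^2 / (n-k). Under the
# null of no autocorrelation, Q follows a chi-square with h degrees of freedom.
def ljung_box(res, h):
    n = len(res)
    mean = sum(res) / n
    den = sum((e - mean) ** 2 for e in res)
    q = 0.0
    for k in range(1, h + 1):
        r_k = sum((res[t] - mean) * (res[t + k] - mean) for t in range(n - k)) / den
        q += r_k ** 2 / (n - k)
    return n * (n + 2) * q

rng = random.Random(5)
white = [rng.gauss(0.0, 1.0) for _ in range(2000)]   # i.i.d. residuals
ar = [0.0]
for _ in range(1999):
    ar.append(0.8 * ar[-1] + rng.gauss(0.0, 1.0))    # autocorrelated residuals

CHI2_95_DF10 = 18.31   # 95th percentile of chi-square with 10 d.o.f.
print(ljung_box(white, 10), ljung_box(ar, 10))
```

The white-noise residuals produce a small Q, while the autocorrelated series produces a Q far above the critical value, so the corresponding model would be rejected at this step.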

#### *Forecast:*

Predictions can be ex-ante, made to calculate future values of the variable under study in the short term, or ex-post, made to generate values within the sample period. The better the latter are, the more efficient the estimated model. We choose the best model through the lowest Mean Absolute Percentage Error (MAPE), a formal measure of the quality of ex-post forecasts. Therefore, the lower the MAPE value, the better the model's forecasts fit the time series data.
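The MAPE comparison can be sketched in a few lines of Python; the observed values and the two competing forecast sets are invented for illustration:

```python
# MAPE of ex-post forecasts against observed values; the model with the
# lower MAPE fits the series better.
def mape(actual, forecast):
    n = len(actual)
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / n

actual  = [100.0, 110.0, 120.0, 125.0]
model_a = [ 98.0, 112.0, 119.0, 127.0]   # hypothetical forecasts, model A
model_b = [ 90.0, 100.0, 130.0, 140.0]   # hypothetical forecasts, model B
print(round(mape(actual, model_a), 2), round(mape(actual, model_b), 2))   # 1.56 9.86
```

Model A's lower MAPE would make it the preferred model under this criterion.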
