#### *3.3.3 The BOPO*

The BOPO (the ratio of operating expenses to operating income) measures the efficiency and ability of a bank to generate profits from its business activities. A smaller BOPO indicates that the bank can cover its expenses out of its operating revenues. The BOPO is formulated as follows:

$$\text{BOPO} = \frac{\text{Total Operating Expenses}}{\text{Total Operating Revenue}} \times 100\% \tag{3}$$
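As a quick illustration of Eq. (3), the ratio can be computed directly; the function name and figures below are ours, not from the study:

```python
def bopo(total_operating_expenses: float, total_operating_revenue: float) -> float:
    """BOPO (Eq. 3): operating expenses as a percentage of operating revenue."""
    return total_operating_expenses / total_operating_revenue * 100.0

# A bank that spends 85 to earn 100 of operating revenue has BOPO = 85%,
# i.e., it still covers its expenses from its operational revenues.
print(bopo(85.0, 100.0))  # 85.0
```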

### **3.4 Research estimation method**

Research problems are analyzed by using vector autoregression (VAR), applied here to the risk of Islamic banks. Technically, if the data are found to be stationary at the first difference, the VAR model is combined with the error correction model, becoming the vector error correction model (VECM). This study refers to previous work, such as Ascarya's, which mathematically develops a general model of risk in Islamic banks, formulated as follows:

$$FDR_t = \Phi_0 + \Phi_1 NPF_t + \Phi_2 BOPO_t \tag{4}$$

$$NPF_t = \Phi_0 + \Phi_1 BOPO_t + \Phi_2 FDR_t \tag{5}$$

$$BOPO_t = \Phi_0 + \Phi_1 FDR_t + \Phi_2 NPF_t \tag{6}$$

where *FDRt* is the financing-to-deposit ratio, *NPFt* is the nonperforming financing ratio, and *BOPOt* is the ratio of operating expenses to operating income (cost-to-income ratio).

#### **3.5 Research model and analysis method**

The data analysis technique involves analyzing the data and testing their validity [24]. This study uses parametric inferential statistics, specifically the vector error correction model (VECM) method, which is used to determine both the short-term and the long-term relationships among the variables. In terms of the research design, the steps of the data analysis technique are as follows:

#### *3.5.1 Testing stationary data*

The first step in the VECM estimation is to test whether the data are stationary. Data can be declared stationary if the time series tends to move toward its average; according to Kuncoro [25], stationary data, when plotted against time, will often cross the horizontal axis, and their autocorrelation will decrease regularly for a considerable lag. In short, the data are considered stationary if their mean and variance are constant over time.

According to Basuki [26], the augmented Dickey-Fuller (ADF) test is used to test data stationarity. If the t-ADF value is smaller than the MacKinnon critical value, it can be concluded that the data are stationary, that is, they do not contain a unit root. This unit-root test is carried out from the level up to the first difference: if stationarity is not achieved at the level, a first-difference test is necessary.

#### *3.5.2 Selecting lag length criteria*

A lag in economics is used to express the dependence of one variable on another over time. The lag length is determined in order to obtain the parameter estimates in the VECM, since the causality relationship in the VECM estimation is strongly influenced by the lag length. Basuki and Yuliadi [27] explain that if the lag entered is too short, the resulting estimation may be inaccurate; conversely, if the lag entered is too long, it will produce inefficient estimation results. The determination of the optimum lag length is therefore important and can be computed by using EViews software.

#### *3.5.3 Testing the stability of VAR models*

Before the VAR estimation, a stability test must first be carried out. According to Basuki and Yuliadi [27], the stability of the model needs to be tested because it affects the results of the impulse response function (IRF) and the variance decomposition (VDC); if stability is not established, the IRF and VDC results are invalid. A VAR system is said to be stable, or to fulfill the stability test, if every root has a modulus smaller than one. In this study, the modulus values are less than one, which means that the results of the IRF and VDC analyses are valid.

#### *3.5.4 Testing cointegration*

A cointegration test is intended to see whether there is a long-term relationship between one variable and another. In the VECM estimation, a cointegration test is necessary to determine whether each variable has a long-term relationship or only a short-term one. Technically, if the observed variables have no cointegration relationship, then the VECM estimation does not apply; if, on the contrary, the data are related in the long term (cointegrated), then the VECM is applied.

According to Basuki and Yuliadi [27], as stated by Engle and Granger, the existence of non-stationary variables raises the possibility of a long-term relationship between the variables in the system. The cointegration test is performed to determine whether such a relationship exists, especially in the long term. If there is cointegration among the variables used in the model, it can be ascertained that a long-term relationship exists between them. The *Johansen cointegration* method can then be used to test for this cointegration.

#### *3.5.5 Applying VECM*

The VECM is a derivative of the VAR model. The difference between the VAR and the VECM lies in the VECM estimation, particularly in measuring the cointegration condition: a cointegration relationship between variables indicates a long-term relationship [27]. The VECM is often used for non-stationary series that have a cointegration relationship. The VECM specification restricts the long-run relationship of the endogenous variables so that it remains convergent within the cointegration relationship, while still allowing for short-term dynamics. The process for deciding on the VECM method can be seen in **Figure 2**.

#### *3.5.6 Applying IRF*

The IRF analysis is conducted to check the response of each variable to a shock, so that the effect of a shock to one variable on the other variables can be explained clearly. The IRF results also show how long it takes for one variable to respond to another.

#### *3.5.7 Applying VDC*

The VDC analysis aims to measure the contribution, or composition, of the influence of each variable on the other variables; it provides this information and can be computed by using EViews software.

*Risk Analyses on Islamic Banks in Indonesia DOI: http://dx.doi.org/10.5772/intechopen.92245*
