**3. Analysis of rolling bearing fault features based on approximate entropy, sample entropy, and information entropy**

#### **3.1 Nonlinear dynamic analysis of vibration signal**

#### *3.1.1 Time domain analysis*

**Figures 5–8** show the time domain signals of the rolling bearing under the four working conditions in turn.

**Figure 5.** *The normal bearing signal.*

**Figure 6.** *The inner ring fault signal.*

From the time domain plots of the rolling bearing vibration signal, it can be seen that the signals generated under different working conditions differ, but the differences are not obvious. Neither the human eye nor a computer can make an accurate judgment from the raw vibration signal alone, and the error of doing so is large. It is therefore necessary to dig deeper into the vibration signal data and extract the characteristics of each working condition.

By calculating the entropy of the vibration signal of rolling bearing, its characteristics can be extracted well.

*Perspective Chapter: On Rolling Bearing Fault Feature Extraction Based on Entropy Feature DOI: http://dx.doi.org/10.5772/intechopen.105095*

**Figure 7.** *The rolling element fault signal.*

**Figure 8.** *The outer ring fault signal.*

#### *3.1.2 Fast Fourier transform and power spectral density*

Fast Fourier transform (FFT) is the general name for the efficient, fast methods of computing the discrete Fourier transform (DFT) on a computer. The fast Fourier transform was proposed by J.W. Cooley and J.W. Tukey in 1965. This algorithm greatly reduces the number of multiplications a computer needs to calculate the discrete Fourier transform; the larger the number of sampling points N, the more significant the saving of the FFT algorithm.

Fast Fourier transform (FFT) is a method to quickly calculate the discrete Fourier transform of a sequence or its inverse. Fourier analysis converts a signal from its original domain (usually time or space) to a representation in the frequency domain. For a sequence $x(n) = \{x\_0, x\_1, \dots, x\_{N-1}\}$, $0 \le n < N$, the discrete Fourier transform expression is:

$$\hat{x}(k) = \sum\_{n=0}^{N-1} x(n)\, e^{-i\frac{2\pi nk}{N}} \tag{4}$$
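As an illustration of Eq. (4), the following is a minimal numpy sketch (not from the chapter) that computes the DFT directly from the definition and checks it against the library FFT. The two-tone test signal and sampling rate are illustrative stand-ins for the measured bearing data.

```python
import numpy as np

def dft(x):
    """Direct implementation of Eq. (4): x_hat(k) = sum_n x(n) e^{-i 2 pi n k / N}."""
    N = len(x)
    n = np.arange(N)
    k = n.reshape(-1, 1)
    # Row k of the exponent matrix holds e^{-i 2 pi n k / N} for all n
    return (x * np.exp(-2j * np.pi * n * k / N)).sum(axis=1)

# A toy two-tone signal standing in for a bearing vibration record (assumed values).
fs = 1000                      # sampling rate in Hz, illustrative
t = np.arange(256) / fs
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

X_direct = dft(x)              # O(N^2) definition
X_fft = np.fft.fft(x)          # O(N log N) fast Fourier transform
assert np.allclose(X_direct, X_fft)
```

The direct sum costs O(N²) multiplications while the FFT costs O(N log N), which is the saving the text describes.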

**Figures 9–12** show the fast Fourier transforms of the rolling bearing vibration signals under the four working conditions.

It can be seen that under the normal working condition there are obvious, very regular peaks at frequencies of 0, 500, and 1000 Hz. When the inner ring fails, the peaks are concentrated at 1000–2000 Hz. When the rolling element fails, the peaks are concentrated at 1500–1800 Hz. When the outer ring fails, the peaks are concentrated at 800–1500 Hz and 1700–1800 Hz. Therefore, the FFT can be applied not only to noise-free signals but also to fault signals containing noise.

**Figures 13–16** show the power spectral density estimates of the rolling bearing vibration signals under the four working conditions.

The power spectral density shows that the power of the rolling bearing vibration signal is distributed over a different range in each working condition. This means that the degree of chaos of this nonlinear system differs, and hence the entropy differs as well, so it can be extracted as a feature.
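A power spectral density estimate of the kind shown in Figures 13–16 can be sketched with a simple periodogram. The following numpy example is an illustration rather than the chapter's code; the sampling rate, signal, and bin layout are assumptions.

```python
import numpy as np

def periodogram_psd(x, fs):
    """One-sided periodogram estimate of the power spectral density."""
    N = len(x)
    X = np.fft.rfft(x)
    psd = (np.abs(X) ** 2) / (fs * N)
    psd[1:-1] *= 2              # fold negative frequencies into the one-sided estimate
    freqs = np.fft.rfftfreq(N, d=1 / fs)
    return freqs, psd

fs = 1000                       # assumed sampling rate, Hz
t = np.arange(1000) / fs
# A 50 Hz tone buried in mild noise, standing in for a bearing vibration record
x = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.default_rng(0).standard_normal(1000)

freqs, psd = periodogram_psd(x, fs)
assert freqs[np.argmax(psd)] == 50.0   # the PSD peaks at the tone frequency
```

Signals whose power is smeared across many bins (as in the fault conditions) have a flatter, more spread-out PSD than the concentrated peaks of the normal condition.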

**Figure 9.** *Normal Working Condition Fast Fourier Transform.*

**Figure 10.** *Inner Ring Fault Fast Fourier Transform.*

The collected vibration signal is strongly affected by noise, so features cannot be extracted directly from the time domain signal. From the frequency domain curves, it can be seen that there are peaks at the characteristic frequencies of each working condition. Under the normal working condition, the peak amplitudes obtained by the fast Fourier transform are concentrated at a few frequencies, whereas for the inner ring, outer ring, and rolling element faults the amplitude peaks are scattered across many frequencies, showing different chaotic phenomena and different degrees of chaos.


**Figure 11.** *Rolling Element Fault Fast Fourier Transform.*

**Figure 12.** *Outer Ring Fault Fast Fourier Transform.*

However, frequency components such as the rotation frequency and its harmonics are not obvious in the frequency domain curve. If the fault is mild or the fault mode is complex, the characteristic frequency peaks of each working condition are likely to be submerged in noise and cannot be identified from the frequency domain diagram alone. Different degrees of chaos, however, mean different entropy, so entropy can be used as a fault feature of rolling bearings.

**Figure 13.** *Normal Working Condition Power Spectral Density Estimate.*

**Figure 14.** *Inner Ring Fault Power Spectral Density Estimate.*

**Figure 15.** *Rolling Element Fault Power Spectral Density Estimate.*


**Figure 16.** *Outer Ring Fault Power Spectral Density Estimate.*

#### **3.2 Approximate entropy**

#### *3.2.1 The concept of the approximate entropy*

Approximate entropy (ApEn) is a nonlinear dynamic parameter of a sequence proposed by Pincus in 1991. ApEn reflects the degree of self-similarity of the sequence's patterns [16].

A larger ApEn value means a more complex sequence and a less predictable system. ApEn measures the rate at which new patterns appear as the embedding dimension increases, thus reflecting the structural complexity of the data.

From the above, we know that a rolling bearing produces vibration and that the vibration signals differ across failure modes. Given the physical meaning of ApEn, different signals imply different complexities, which can be used as features for rolling bearing fault diagnosis.

#### *3.2.2 Fast algorithm for the approximate entropy*

Computing the approximate entropy directly involves much redundant calculation and wastes time. A fast algorithm for the approximate entropy is presented as follows.

Let the original sequence be {u(i), i = 1, 2, … , N} and r = 0.1–0.25 SD(u) (SD denotes the standard deviation of the sequence {u(i)}), for which the approximate entropy is reasonable. Select m = 2 and N = 500–1000.

Calculate the N × N distance matrix D, whose element in row *i* and column *j* is denoted $d\_{ij}$:

$$d\_{ij} = \begin{cases} 1, & |u(i) - u(j)| < r \\ 0, & |u(i) - u(j)| \ge r \end{cases} \quad i = 1 \sim N,\ j = 1 \sim N,\ i \ne j \tag{5}$$

Using the elements of D, calculate $C\_i^2(r)$ and $C\_i^3(r)$:

$$C\_i^2(r) = \sum\_{j=1}^{N-1} d\_{ij} \cap d\_{(i+1)(j+1)} \tag{6}$$

$$C\_i^3(r) = \sum\_{j=1}^{N-2} d\_{ij} \cap d\_{(i+1)(j+1)} \cap d\_{(i+2)(j+2)} \tag{7}$$

**Figure 17.** *The approximate entropy curves.*

Calculate $H^2(r)$ and $H^3(r)$ from $C\_i^2(r)$ and $C\_i^3(r)$, where $H^m(r)$ is the average of $\ln C\_i^m(r)$ over $i$, and finally calculate the approximate entropy:

$$\mathrm{ApEn}(m, r, N) = H^m(r) - H^{m+1}(r) \tag{8}$$

The approximate entropy of the sequence can be calculated from the above calculations.
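The fast algorithm of Eqs. (5)–(8) can be sketched as follows. This is an illustrative numpy implementation, not the chapter's code: it normalizes $C\_i^m(r)$ by the number of windows (which the equations above leave implicit) and includes self-matches, as in Pincus's original definition.

```python
import numpy as np

def apen(u, m=2, r_factor=0.15):
    """Approximate entropy via the binary distance-matrix trick of Eqs. (5)-(8)."""
    u = np.asarray(u, dtype=float)
    N = len(u)
    r = r_factor * np.std(u)                   # tolerance as a fraction of SD
    # Eq. (5): d_ij = 1 if |u(i) - u(j)| < r, else 0 (self-matches kept here)
    D = np.abs(u[:, None] - u[None, :]) < r

    def phi(m):
        # Logical AND of shifted copies of D, cf. Eqs. (6)-(7): window i matches
        # window j only if every elementwise distance is below r.
        C = np.ones((N - m + 1, N - m + 1), dtype=bool)
        for k in range(m):
            C &= D[k:k + N - m + 1, k:k + N - m + 1]
        Ci = C.sum(axis=1) / (N - m + 1)       # normalized C_i^m(r)
        return np.log(Ci).mean()               # H^m(r) as the average of ln C_i^m(r)

    return phi(m) - phi(m + 1)                 # Eq. (8)

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))   # highly regular signal
noisy = rng.standard_normal(500)                    # irregular signal
assert apen(regular) < apen(noisy)                  # more complexity -> larger ApEn
```

Building the binary matrix D once and reusing it for both m = 2 and m = 3 is exactly where the fast algorithm saves computation over evaluating the distances separately for each dimension.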

#### *3.2.3 Application of approximate entropy in mechanical fault diagnosis*

Here, the vibration signal is divided into groups of six hundred points, and ten groups of approximate entropy values are calculated.

From **Figure 17**, it is not difficult to see that when the rolling bearing is normal, the approximate entropy is small, because under normal conditions the generated signal is relatively simple. When the rolling bearing fails, much complicated information is generated, increasing the approximate entropy. However, the approximate entropy values of the rolling element fault and the normal working condition are very similar, so the two cannot easily be distinguished. If only a single entropy feature is used, misjudgment is likely.

#### **3.3 Sample entropy**

#### *3.3.1 The notion of sample entropy*

In 2000, the concept of sample entropy was first proposed by Richman et al.: a time-series complexity metric similar to approximate entropy but more robust, with greater resistance to interference and noise [17].

Sample entropy improves the approximate entropy algorithm and reduces its calculation error; it is a similar algorithm with superior computational accuracy.


Sample entropy also exhibits better consistency: if one time series has a higher sample entropy than another for one choice of m and r, it also has a higher value for other choices of m and r.

#### *3.3.2 Algorithm for sample entropy*

Generally, r takes 0.1–0.25 SD (SD is the standard deviation of the raw data); in this paper r = 0.15 SD, m = 2 is selected, and N = 500–1000.

Assuming the data are $\{X\_i\} = \{x\_1, x\_2, \dots, x\_N\}$ with length N, an m-dimensional vector is reconstructed from the original signal:

$$x\_i = [x\_i, x\_{i+1}, \dots, x\_{i+m-1}], \quad i = 1, 2, \dots, N - m \tag{9}$$

Define the distance between $x\_i$ and $x\_j$:

$$d\_{ij} = d[x(i), x(j)] = \max\_{k \in [0, m-1]} |x(i+k) - x(j+k)|, \quad i, j = 1, 2, \dots, N - m,\ i \ne j \tag{10}$$

For each i, count the number of j (j ≠ i) with $d\_{ij} < r$ and divide by N − m − 1 to obtain $B\_i^m(r)$; then find its average:

$$B^m(r) = \frac{1}{N-m} \sum\_{i=1}^{N-m} B\_i^m(r) \tag{11}$$

Similarly, repeat the above steps for dimension m + 1 to obtain $B\_i^{m+1}(r)$ and its average $B^{m+1}(r)$. When N is finite, the sample entropy is defined as:

$$\mathrm{SampEn}(m, r, N) = \ln B^m(r) - \ln B^{m+1}(r) \tag{12}$$

The sample entropy of the sequence data is obtained from the above calculations.
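Eqs. (9)–(12) can be sketched as follows. This is an illustrative numpy implementation (not the chapter's code) that excludes self-matches, as the sample entropy definition requires; the r factor of 0.15 SD follows the choice stated above.

```python
import numpy as np

def sampen(x, m=2, r_factor=0.15):
    """Sample entropy following Eqs. (9)-(12); self-matches are excluded."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    r = r_factor * np.std(x)

    def match_fraction(m):
        # Rows are the m-dimensional vectors of Eq. (9), i = 1 .. N - m
        emb = np.lib.stride_tricks.sliding_window_view(x, m)[:N - m]
        # Chebyshev distance of Eq. (10) between every pair of vectors
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        matches = (d < r).sum() - (N - m)      # drop the i == j self-matches
        return matches / ((N - m) * (N - m - 1))   # average B_i^m(r), Eq. (11)

    # Eq. (12): SampEn = ln B^m(r) - ln B^{m+1}(r)
    return np.log(match_fraction(m)) - np.log(match_fraction(m + 1))

rng = np.random.default_rng(1)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))
noisy = rng.standard_normal(500)
assert sampen(regular) < sampen(noisy)
```

Because the self-matches are excluded and the logarithm is taken after averaging, the estimate avoids the bias that affects approximate entropy.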

#### *3.3.3 Application of sample entropy in mechanical fault diagnosis*

The first 6000 data points are taken in groups of 600 to calculate the sample entropy, as shown in **Figure 18**.

**Figure 18.** *Sample entropy curves.*

From **Figure 18**, it is not difficult to see that the inner ring fault, the rolling element fault, and the normal working condition are very difficult to distinguish, but the outer ring fault can be distinguished.

#### **3.4 Information entropy**

#### *3.4.1 The concept of information entropy*

Information entropy is mostly used as a quantitative indicator of the information content of a system. The information entropy can be further used as a criterion for the optimization of the system equations [18].

#### *3.4.2 Algorithm for information entropy*

$$H(X) = -\sum\_{i=1}^{n} p(x\_i) \log p(x\_i) \tag{13}$$

X represents a random variable taking the values $x\_1, x\_2, \dots, x\_n$, and $p(x\_i)$ denotes the probability of the event $x\_i$.
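For a sampled vibration signal, Eq. (13) might be estimated as follows. This numpy sketch is illustrative: the chapter does not state how $p(x\_i)$ is obtained from the data, so the histogram-based probability estimate and the bin count are assumptions.

```python
import numpy as np

def info_entropy(x, bins=16):
    """Shannon entropy of Eq. (13), with p(x_i) estimated from a histogram."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                       # by convention, 0 * log(0) is taken as 0
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
uniform = rng.uniform(-1, 1, 600)      # spread over all bins -> high entropy
constant = np.full(600, 0.5)           # single bin -> zero entropy
assert info_entropy(constant) == 0.0
assert info_entropy(uniform) > info_entropy(constant)
```

A signal whose amplitudes spread over many histogram bins (a complex, chaotic signal) yields a larger entropy than one concentrated in a few bins, which is the property exploited as a fault feature.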

#### *3.4.3 Application of information entropy in mechanical fault diagnosis*

The first 6000 data points are taken in groups of 600, and the information entropy is calculated, as shown in **Figure 19**.

As can be seen from **Figure 19**, the rolling element fault and the normal working condition intersect, and the inner ring fault and the outer ring fault come very close at several points. Without further processing of the information entropy, it is difficult to distinguish the rolling element fault from the normal working condition, and it is also easy to misjudge the inner ring fault and the outer ring fault.
