*3.6.1 Feature extraction*

Each of the three entropies has considerable disadvantages when used for independent judgment, and misjudgment is likely, so further processing is needed. We therefore extract the fault features of rolling bearings by jointly analyzing the approximate entropy, sample entropy, and information entropy.

#### *Perspective Chapter: On Rolling Bearing Fault Feature Extraction Based on Entropy Feature DOI: http://dx.doi.org/10.5772/intechopen.105095*

The data are taken from the Case Western Reserve University bearing data set: vibration acceleration signals in four different modes under a load of 2 hp, with a fault diameter of 0.1778 mm and a rotational speed of 1750 r/min. The four groups correspond to the four modes. Each group contains 60,000 data points and is divided into 10 segments of 6,000 points, and each segment is further divided into 10 sub-segments of 600 points.

For each section, the approximate entropy, sample entropy, and information entropy are computed, giving ten values of each entropy per group; averaging each set of ten yields the approximate entropy mean, sample entropy mean, and information entropy mean. See **Figures 20–22**.
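The per-section entropy computation can be sketched as follows. This is a minimal sketch, not the chapter's own code: the embedding dimension m = 2, tolerance r = 0.2 times the standard deviation, and the histogram bin count for information entropy are common defaults assumed here, not values stated in the text.

```python
import numpy as np

def _phi(x, m, r):
    # mean log fraction of templates of length m within tolerance r (Chebyshev distance)
    n = len(x)
    templates = np.array([x[i:i + m] for i in range(n - m + 1)])
    counts = []
    for t in templates:
        d = np.max(np.abs(templates - t), axis=1)
        counts.append(np.mean(d <= r))   # self-match included, so counts > 0
    return np.mean(np.log(counts))

def approximate_entropy(x, m=2, r_factor=0.2):
    r = r_factor * np.std(x)
    return _phi(x, m, r) - _phi(x, m + 1, r)

def sample_entropy(x, m=2, r_factor=0.2):
    r = r_factor * np.std(x)
    def count_pairs(mm):
        # number of distinct template pairs of length mm within tolerance r
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        total = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            total += np.sum(d <= r)
        return total
    return -np.log(count_pairs(m + 1) / count_pairs(m))

def information_entropy(x, bins=16):
    # Shannon entropy of the amplitude histogram
    hist, _ = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def segment_entropy_means(signal, n_segments=10):
    # split the record into sections and average each entropy over the sections
    segs = np.array_split(np.asarray(signal, dtype=float), n_segments)
    ap = np.mean([approximate_entropy(s) for s in segs])
    se = np.mean([sample_entropy(s) for s in segs])
    ie = np.mean([information_entropy(s) for s in segs])
    return np.array([ap, se, ie])
```

The three means returned by `segment_entropy_means` correspond to one column of the entropy feature matrix discussed below.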

From these figures it is not difficult to see that some working conditions remain hard to distinguish with a single entropy feature: approximate entropy has certain errors in distinguishing the normal condition from the outer ring fault, sample entropy has certain errors in distinguishing the inner ring fault from the rolling element fault, and information entropy has certain errors in distinguishing the normal condition from the rolling element fault.

Therefore, the three entropy means are extracted for each fault mode to form four column vectors, as in **Table 5**, where each column is the entropy feature vector of the corresponding working condition.

The entropy feature column vectors obtained in the same way from the test data form the entropy feature matrix of the test data.

The average entropy feature vectors are then compared with the test-data entropy feature vector: taking the absolute value of their component-wise differences yields four new vectors.

**Figure 20.** *Approximate entropy mean.*

**Figure 21.** *Mean sample entropy.*

**Figure 22.** *Information entropy mean.*


**Table 5.** *Entropy feature matrices.*

The four new vectors form an entropy-difference matrix, and the minimum of each row is taken. The column that contains the largest number of these row minima indicates the failure mode to which the test data are closest.

In rare cases, however, the three entropy features point to three different failure modes. In that case, the failure mode indicated by the approximate entropy mean, which has the greatest discrimination, is taken as the final failure mode of the test data.
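The decision rule described above (row minima, majority column, approximate entropy tie-break) can be sketched as follows. The mode names and the 3×4 matrix layout (rows = entropies, columns = working conditions) are illustrative assumptions, not identifiers from the chapter:

```python
import numpy as np
from collections import Counter

def diagnose(test_vec, ref_matrix, modes):
    # test_vec: (3,) entropy means of the test data
    # ref_matrix: 3 x 4, rows = (ApEn, SampEn, InfoEn) means, cols = conditions
    diff = np.abs(test_vec[:, None] - ref_matrix)   # 3 x 4 absolute differences
    votes = np.argmin(diff, axis=1)                 # nearest condition per entropy
    best, n = Counter(votes).most_common(1)[0]
    if n == 1:
        # all three entropies disagree: fall back on approximate entropy (row 0),
        # which has the greatest discrimination
        best = votes[0]
    return modes[best]
```

For example, a test vector whose components are all nearest to the second column is classified as that column's condition; if the three rows each pick a different column, the approximate entropy row decides.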

#### *3.6.2 Simulation experiment*

In the feature extraction stage above, we extracted the approximate entropy mean, sample entropy mean, and information entropy mean for the four fault modes of the rolling bearing. We can effectively distinguish the four fault modes by comparing the feature vectors.

A segment of 6,000 data points is randomly drawn from the data, its fault features are extracted, and its working condition is determined. The result is then compared with the previously established standard. Each group is tested 500 times, with at least 10 groups, and the test accuracy is computed as described above. On this basis the simulation experiment begins.
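The test protocol (pick a random condition, draw a 6,000-point sample, classify it, tally the accuracy over repeated trials) can be sketched with a generic harness. The `signals` data layout and the stand-in extractor/classifier used in the usage example are hypothetical; in the chapter's method, `extract` would be the three-entropy-mean computation and `classify` the nearest-feature-vector rule:

```python
import numpy as np

def run_trials(signals, extract, classify, n_trials=500, seg_len=6000, seed=0):
    """Accuracy of classify(extract(sample)) over random samples.

    signals: dict mapping mode name -> 1-D vibration record (hypothetical layout)
    extract: maps a 1-D sample to a feature (e.g. entropy-mean vector)
    classify: maps a feature back to a mode name
    """
    rng = np.random.default_rng(seed)
    modes = list(signals)
    correct = 0
    for _ in range(n_trials):
        true_mode = modes[rng.integers(len(modes))]     # random working condition
        rec = signals[true_mode]
        start = rng.integers(len(rec) - seg_len + 1)    # random contiguous sample
        sample = rec[start:start + seg_len]
        if classify(extract(sample)) == true_mode:
            correct += 1
    return correct / n_trials
```

A toy usage example with two synthetic "conditions" that differ only in amplitude, a standard-deviation feature, and a nearest-reference classifier:

```python
rng = np.random.default_rng(1)
signals = {"low": rng.standard_normal(20000),
           "high": 5.0 * rng.standard_normal(20000)}
refs = {"low": 1.0, "high": 5.0}
classify = lambda f: min(refs, key=lambda m: abs(refs[m] - f))
acc = run_trials(signals, np.std, classify, n_trials=50, seg_len=600, seed=2)
```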

The data are again taken from the Case Western Reserve University data set: vibration acceleration signals in four different modes under a load of 2 hp, a fault diameter of 0.1778 mm, and a rotational speed of 1750 r/min.

The 6,000 outer ring fault points drawn from the 60,000-point record are divided into ten sections of 600 points each, and the approximate entropy, sample entropy, and information entropy means are calculated to form a column vector.

The specific process is shown in **Figure 23**.


**Figure 23.** *Test process.*

Any one of the four working conditions is chosen at random as the test condition. From it, 6,000 vibration signal points are extracted; the approximate entropy, sample entropy, and information entropy are calculated in groups of 600, and the means are taken. These entropy means are arranged as a test vector. The test vector is differenced component-wise with each of the four working-condition feature vectors and the absolute values are taken; the test condition is then the condition whose feature vector gives the smallest absolute difference in each component. For ease of understanding and observation, the differences between the test vector and the feature vectors are visualized, giving **Figure 24**.

**Figure 24.** *Entropy difference 1.*

It can be clearly seen from the figure that the absolute value of the difference between the yellow line characteristic vector corresponding to the inner ring fault and the test vector is the smallest, so the test condition is the inner ring fault (**Figure 25**).

It can be clearly seen from **Figure 25** that the absolute value of the difference between the purple line feature vector corresponding to the rolling element fault and the test vector is the smallest, so the test condition is the rolling element fault.

#### *3.6.3 Comparative Experiment*

In order to further prove the effectiveness of this method, a comparative experiment with the single entropy feature was carried out. The accuracy of the method using triple entropy combined features is 98.28%. The accuracy of the method characterized by single approximate entropy is 89.64%. The accuracy of the method characterized by single sample entropy is 79.80%. The accuracy of the method characterized by single information entropy is 71.96%. The four methods were repeated ten times respectively. **Table 6** shows the accuracy and average results of the experiment.

The accuracy of the triple entropy combination feature method is 8.64% higher than that of the single approximate entropy method, 18.48% higher than that of the single sample entropy method, and 26.32% higher than that of the single information entropy method. It can be seen that the method of triple entropy combination features can effectively diagnose rolling bearing faults.
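The margins between the combined method and each single-entropy method follow directly from the reported accuracies (differences in percentage points):

```python
# Accuracies reported in the comparative experiment (percent)
acc = {"combined": 98.28, "ApEn": 89.64, "SampEn": 79.80, "InfoEn": 71.96}

# Margin of the combined-feature method over each single-entropy method,
# in percentage points
margins = {k: round(acc["combined"] - v, 2)
           for k, v in acc.items() if k != "combined"}
# margins: ApEn 8.64, SampEn 18.48, InfoEn 26.32
```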

**Figure 25.** *Entropy difference 2.*



**Table 6.** *Comparison of accuracy of four methods.*
