**3.5 Approximate entropy, sample entropy, information entropy, and maximum Lyapunov exponent**

Six thousand data points are randomly taken from each of the four working conditions and divided into groups of 600; the approximate entropy, sample entropy, information entropy, and maximum Lyapunov exponent are then calculated for each group. See **Tables 1–4** for the specific results.
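The three entropy measures can be computed per 600-point group as follows; this is a minimal sketch using the standard definitions of ApEn(m, r) and SampEn(m, r), with the common choices m = 2 and r = 0.2σ (these parameter values are assumptions for illustration, not taken from the chapter):

```python
import numpy as np

def approximate_entropy(x, m=2, r_frac=0.2):
    """ApEn(m, r): self-matches are included, r = r_frac * std(x)."""
    x = np.asarray(x, dtype=float)
    n, r = len(x), r_frac * np.std(x)

    def phi(dim):
        # Embedding vectors of length `dim`
        emb = np.array([x[i:i + dim] for i in range(n - dim + 1)])
        logs = []
        for v in emb:
            # Chebyshev distance; the self-match keeps each count > 0
            d = np.max(np.abs(emb - v), axis=1)
            logs.append(np.log(np.mean(d <= r)))
        return np.mean(logs)

    return phi(m) - phi(m + 1)

def sample_entropy(x, m=2, r_frac=0.2):
    """SampEn(m, r): self-matches are excluded."""
    x = np.asarray(x, dtype=float)
    n, r = len(x), r_frac * np.std(x)

    def matches(dim):
        emb = np.array([x[i:i + dim] for i in range(n - dim + 1)])
        total = 0
        for i in range(len(emb) - 1):
            # Compare each vector only with later ones (no self-matches)
            d = np.max(np.abs(emb[i + 1:] - emb[i]), axis=1)
            total += np.sum(d <= r)
        return total

    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def information_entropy(x, bins=10):
    """Shannon entropy (in nats) of the amplitude histogram."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / len(x)
    return -np.sum(p * np.log(p))

def per_group_features(signal, group_len=600):
    """Split a 6000-point record into 600-point groups, as in the text,
    and return the three entropies for each group."""
    groups = np.asarray(signal, dtype=float).reshape(-1, group_len)
    return [(approximate_entropy(g), sample_entropy(g),
             information_entropy(g)) for g in groups]
```

As a sanity check, a broadband (noise-like) vibration segment should score higher on all three measures than a nearly periodic one of the same length.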

**Figure 19.** *Information entropy curve.*

*Perspective Chapter: On Rolling Bearing Fault Feature Extraction Based on Entropy Feature DOI: http://dx.doi.org/10.5772/intechopen.105095*


#### **Table 1.** *Normal working condition.*


#### **Table 2.** *Inner ring fault.*

The above data show a certain correlation between the maximum Lyapunov exponent and the entropy measures, but this only indicates that the rolling bearing behaves as a chaotic system, to different degrees, under each working condition. Because the maximum Lyapunov exponent discriminates poorly between the working conditions, it remains difficult to diagnose rolling bearing faults from it alone.
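For context, the maximum Lyapunov exponent measures the average exponential divergence rate of nearby trajectories, and a positive value indicates chaos. The chapter estimates it from measured vibration data; purely as an illustration of the definition (the logistic map here is an assumed toy system, not the bearing signal), the exponent of a 1-D map can be computed as the orbit average of ln|f′(xₙ)|:

```python
import numpy as np

def logistic_lyapunov(r=4.0, x0=0.3, n_transient=1000, n_iter=100_000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the orbit average of ln|f'(x_n)| = ln|r*(1-2x_n)|."""
    x = x0
    for _ in range(n_transient):          # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n_iter):
        acc += np.log(abs(r * (1.0 - 2.0 * x)))   # ln|f'(x_n)|
        x = r * x * (1.0 - x)
    return acc / n_iter
```

At r = 4 the exact value is ln 2 ≈ 0.693; a positive exponent of this kind is what marks each bearing condition as chaotic, even though its magnitude varies too little across conditions to separate them.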

At the same time, the entropy most strongly correlated with the maximum Lyapunov exponent differs across the working conditions. For example, when the outer ring fails, the sample entropy best characterizes the chaotic behavior, yet under normal conditions it deviates considerably from the maximum Lyapunov exponent.

To better characterize the chaotic behavior and diagnose rolling bearing faults, this paper does not use a single entropy as the feature but uses approximate

#### *Chaos Monitoring in Dynamic Systems – Analysis and Applications*

