**2. The earthquake catalog used**

The earthquake catalog used in this study combines the historical catalog for events before 1970/01/01 with the modern instrumental catalog from 1970/01/01 to 2022/11/20 provided by the China Earthquake Networks Center (CENC). **Figure 2** illustrates the earthquake-recording capability in the CSES area, where ancient Chinese records mention several M8+ earthquakes, by plotting the event number within each magnitude bin and the magnitude-sequence number [18, 19]. Owing to the build-out of the seismic network, and the disturbance caused by a few large earthquakes, the result reveals several distinct time nodes that mark changes in the monitoring capability of this area.

To obtain the completeness distribution of the modern earthquake catalog after 1970, statistical methods based on the earthquake catalog itself [20] are usually applied. **Figure 3** shows the completeness assessment using the Best Combination method (Mc95-Mc90-Max curvature) [20], which reveals a temporal variation that depends mainly on the development of observational facilities. For the catalog before 1970, the completeness magnitude is around 5.0–6.0, although this estimate carries a large uncertainty compared with that of the modern catalog because so few records exist. In general, if a statistical algorithm is applied to the sequence, the magnitude threshold of the historical catalog should be set at 4.0–4.5 to balance the completeness level against a sufficient sample size. For the catalog after 1970, 3.0–4.0 can be taken as the completeness level [5]. On the other hand, strong earthquakes, such as the 2008 Wenchuan 8.0 and 2013 Lushan 7.0 sequences, substantially affect the catalog's

*The "Natural Time" Method Used for the Potential Assessment for Strong… DOI: http://dx.doi.org/10.5772/intechopen.110023*

#### **Figure 1.**

*Spatial distribution of earthquakes with magnitude above 6.0 for the period from 700 B.C. to A.D. 2022 in the CSES region. The dots in dark yellow indicate the strong earthquakes up to 1970. Earthquakes that occurred since 1970/01/01 are shown in light yellow. The study area is shown in the index map at the upper right. The three dashed circles indicate regions with radii of 100, 150, and 200 km, respectively, used in the nowcasting analysis. The centers of these circles (red stars) are the epicenters of the September 5, 2022, Luding* M*S6.8 earthquake; May 21, 2021, Yangbi* M*S6.4 earthquake; October 7, 2014, Jinggu* M*S6.6 earthquake; and August 3, 2014, Ludian* M*S6.5 earthquake.*

completeness owing to observational limitations and routine seismological processing, which should be addressed in the future. As a global estimate, the cut-off magnitude of 4.0 may be taken as the magnitude threshold in the statistical models that follow.
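The maximum-curvature component of the Best Combination estimate can be sketched in a few lines. The following is a minimal illustration, not the CENC processing chain; the synthetic catalog, the 0.1 bin width, and the +0.2 empirical correction are assumptions made for the demonstration:

```python
import numpy as np

def mc_max_curvature(mags, bin_width=0.1, correction=0.2):
    """Estimate the completeness magnitude Mc as the magnitude bin with
    the highest non-cumulative event count (maximum-curvature method),
    plus an optional empirical correction term."""
    mags = np.asarray(mags)
    bins = np.arange(mags.min(), mags.max() + bin_width, bin_width)
    counts, edges = np.histogram(mags, bins=bins)
    mc = edges[np.argmax(counts)]  # left edge of the most populated bin
    return round(mc + correction, 1)

# Synthetic Gutenberg-Richter sample (b ~ 1) truncated below M 3.0,
# so the true completeness level is 3.0 by construction
rng = np.random.default_rng(0)
mags = 3.0 + rng.exponential(scale=1 / np.log(10), size=5000)
print(mc_max_curvature(mags))
```

On this synthetic catalog the most populated bin is the lowest one, so the estimate recovers the truncation level (plus the correction). Real catalogs roll off gradually below Mc, which is why the Mc95/Mc90 goodness-of-fit variants are combined with this estimator in practice.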

A "mixed" magnitude system is used in the earthquake catalog since 1970. Local magnitude (*M*L) is used for events with magnitudes below 4.0, while surface-wave magnitude (*M*S) is used for earthquakes above magnitude 5.5; the transition from *M*L to *M*S is not well defined for events in between. Because of this magnitude uncertainty, only "magnitude" is used in this chapter to indicate the size of events. The numerical results of this chapter, such as the Earthquake Potential Score determined with the nowcasting technique, are unaffected by the magnitude uncertainty [21–24]. In line with Rundle's research [24], we adopted a threshold magnitude of 4.0 and a target magnitude larger

#### **Figure 2.**

*Variation of event number within each magnitude bin (top) and the magnitude-sequence number plot (bottom) of the catalog of the CSES region from 700 B.C. to A.D. 2022/11/20. In the event number plot, the number window of the* x*-axis is chosen as 500. The text above the* x*-axis shows the calendar time. In the magnitude-sequence plot, to make the discrete magnitude distribution easier to see, a random perturbation within [−0.05, 0.05] is added to each magnitude value.*

#### **Figure 3.**

*Completeness magnitude and cumulative number analysis for the earthquake catalog of CSES. The top and bottom plots show the results for 700 B.C. to 1970 and for 1970 to 2022/11/20, respectively. The black line in each plot indicates the completeness magnitude calculated using the Best Combination method (Mc95-Mc90-Max curvature) [20]. The blue lines show the cut-off magnitude level considering the global temporal variation of the completeness magnitude. The dark green line shows the cumulative event number over time. The vertical light red line marks 1970/01/01, which separates the whole catalog into the top and bottom plots. The vertical purplish-red lines mark the occurrence times of the four target strong earthquakes in this study.*

than 6.0 in our investigation. However, for alternative methods that evaluate not only the frequency of earthquakes but also their sizes, the issues of magnitude transition and magnitude uncertainty must be considered.


## **3. Ergodicity analysis**

Some studies have suggested that driven mean-field systems often display effectively ergodic behavior. Egolf [25] and Tiampo et al. [26, 27] showed that statistically stationary models tend to reside in a set of physical states resembling equilibrium; large events, however, can temporarily drive the system out of this balance before it returns to its original state. We apply the method of Tiampo et al. [27, 28] to quantify the base level of heterogeneity and, as a consequence, the predictability of target earthquakes arising from the temporal complexity of the small events in the CSES zone, as shown in **Figure 2**. Following Thirumalai et al. [29] and Thirumalai and Mountain [30], effective ergodicity can be assessed with the Thirumalai-Mountain (TM) metric. Originally, the TM metric measures the discrepancy between the time average of a quantity, commonly energy, at every cell or grid of the system and the ensemble average. For a simple statistical algorithm such as the Pattern Informatics (PI) algorithm, the TM metric can be evaluated not from energy levels but from the frequency of observations [31].

The beginning of the earthquake catalog corresponds to the starting time *t* = 0, and the function *TM*(*t*), computed in the analysis at elapsed time *t*, is defined as

$$\text{TM}(t) = \frac{1}{L} \sum\_{i=1}^{L} \left[ n\_i(t) - \overline{n}(t) \right]^2 \tag{1}$$

where *ni*(*t*) is the time-averaged count of events located within the *i*th grid cell over the period 0 to *t*, *L* is the number of grid cells in the spatial range, and *n̄*(*t*) is the average count of events across all cells during the period 0 to *t*:

$$n\_i(t) = \frac{1}{t} \int\_0^t n\_i(t') \, dt' \tag{2}$$

$$\overline{n}(t) = \frac{1}{L} \sum\_{i=1}^{L} n\_i(t) \tag{3}$$

Tiampo et al. [27, 28] noted that, in Eq. (1), if all grids or cells of the system are statistically comparable, especially in their physical characteristics, the deviation of the time-averaged quantity from the ensemble mean diminishes with time. If the system exhibits "effective ergodic" behavior over a sufficiently long period, time *t* has a direct (linear) relationship with the function 1/*TM*. With a grid size of 0.2° × 0.2° and a cut-off magnitude of 3.0, **Figure 4** depicts the 1/*TM* metric of the seismicity from 1970 to 2022 in the CSES area and its Sichuan and Yunnan portions. From the cases both including and excluding the Wenchuan aftershocks, we can observe that before 1980 both the Yunnan and Sichuan regions showed weak ergodicity and displayed distinct ergodic tendencies. After accounting for changes in the earthquake catalog caused by technology and network problems, Tiampo et al. [27, 28] found that natural seismicity exhibits ergodicity. In this study, considering the condition of China's seismic networks, the regionalized aspects of seismological observation and manual processing may be responsible for the observed ergodic behavior. The seismicity across the entire CSES area

and the two sub-regions exhibits good ergodicity since about 1980. Taking the Sichuan region as an example, the Wenchuan earthquake's aftershocks account for most of the disruption of the 1/*TM* metric in 2008.
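As a rough illustration of Eqs. (1)–(3), the sketch below computes 1/*TM* for a synthetic gridded catalog. It reads *n*i inside the integral of Eq. (2) as the instantaneous activity of cell *i*, so that its time average is the cumulative count divided by the elapsed time; the grid size, cell rates, and step count are all assumptions for the demonstration, not the values used in the chapter:

```python
import numpy as np

def inverse_tm(counts_per_step):
    """1/TM(t) following Eqs. (1)-(3). counts_per_step has shape (T, L):
    events per time step in each of L grid cells. n_i(t) is the
    time-averaged activity of cell i up to step t, n_bar(t) its spatial
    mean, and TM(t) the spatial variance of n_i(t) about n_bar(t)."""
    cum = np.cumsum(counts_per_step, axis=0)    # events in cell i up to step t
    steps = np.arange(1, cum.shape[0] + 1)[:, None]
    n_i = cum / steps                           # Eq. (2), discretized
    n_bar = n_i.mean(axis=1, keepdims=True)     # Eq. (3)
    tm = ((n_i - n_bar) ** 2).mean(axis=1)      # Eq. (1)
    return 1.0 / tm

# Stationary, statistically identical cells: effective ergodicity,
# so 1/TM should grow roughly linearly with time
rng = np.random.default_rng(1)
counts = rng.poisson(1.0, size=(2000, 25))
inv = inverse_tm(counts)
```

For these statistically identical cells the time-averaged rates all converge to the common mean, so *TM*(*t*) decays like 1/*t* and 1/*TM* grows linearly, which is the signature read off **Figure 4**; a persistent plateau in 1/*TM*, by contrast, indicates non-ergodic heterogeneity.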

### **4. Nowcasting method**

The term "nowcast" has long existed in the fields of meteorology and economics. It has since been applied to a variety of fields, including stock-market trend forecasting, real-time cloud-movement mapping, and, under the name "Nowcasting Earthquakes," hazard assessment of strong earthquakes [21–24]. In contrast to "forecasting," which is usually used to produce a probabilistic estimate of future events in the seismic cycle, nowcasting focuses on identifying the current state of a system using indirect approaches. In seismology, this strategy has traditionally been applied in regions with a long history of earthquake records and reasonably high seismic activity. The technique has recently been used to estimate the seismic risk of a number of places, including California [24, 32], Tokyo [22], the Himalayas [33], and New Zealand [34], among others. To compute the Earthquake Potential Score (EPS), the nowcasting technique uses the frequency of small events that occur between larger ones in the same or a nearby study area with a similar dynamic history. The magnitude threshold for small events should be chosen so that all events above it form a complete sequence in the database. The key benefit of this technique is that it is easier to use than direct hazard identification in earthquake forecasting and prediction, which is difficult to implement in practice.

Varotsos et al. [35–37] suggested that the "natural time" in the nowcasting method is obtained by counting the number of small events that have occurred in a study area since the previous strong earthquake. Target events are denoted by *M*λ, whereas small events are denoted by *M*σ (4.0 here), which accounts for the completeness of the historical and modern catalogs. The Gutenberg-Richter magnitude-frequency relationship characterizes the average frequency of small events larger than *M*σ but smaller than *M*λ [38]. The Gutenberg-Richter relation is used here to obtain the average count of events larger than *M*,
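Extracting the natural-time count from a time-ordered magnitude sequence is straightforward; a minimal sketch, with an invented toy sequence and the chapter's thresholds as defaults, might look like:

```python
def natural_time_counts(mags, m_sigma=4.0, m_lambda=6.0):
    """Count 'natural time': the number of small events
    (m_sigma <= M < m_lambda) between successive target events
    (M >= m_lambda) in a time-ordered magnitude sequence.
    Returns the counts of completed cycles and the running count
    since the most recent target event."""
    cycles, n = [], 0
    for m in mags:
        if m >= m_lambda:
            cycles.append(n)  # a target event closes the current cycle
            n = 0
        elif m >= m_sigma:
            n += 1            # small event advances natural time
    return cycles, n

mags = [4.1, 5.0, 4.3, 6.2, 4.0, 4.8, 5.1, 4.2, 6.5, 4.4, 4.9]
print(natural_time_counts(mags))  # -> ([3, 4], 2)
```

The completed-cycle counts feed the empirical distribution used below, while the running count is the *n*(*t*) that enters the EPS.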

#### **Figure 4.**

*The 1/TM metric indicates the seismic ergodicity in the CSES region. The left, middle, and right plots show the entire CSES, Sichuan, and Yunnan regions, respectively. The top and bottom rows show the cases computed with and without aftershocks of the Wenchuan 8.0 earthquake. The vertical dashed lines mark the occurrence time of the 2008 Wenchuan earthquake.*


$$N\_{avg} = 10^a 10^{-bM} \tag{4}$$

where *b* is normally close to 1 and the *a* value indicates the background level of seismicity. Let *N*σ denote the average number of small events with magnitude greater than *M*σ, and *N*λ the mean frequency of events greater than *M*λ:

$$N\_{\sigma} = 10^{a} 10^{-bM\_{\sigma}} \tag{5}$$

$$N\_{\lambda} = 10^{a} 10^{-bM\_{\lambda}} \tag{6}$$

then the average number of small earthquakes between successive large earthquakes is

$$N = \frac{N\_{\sigma} - N\_{\lambda}}{N\_{\lambda}} = 10^{b(M\_{\lambda} - M\_{\sigma})} - 1 \tag{7}$$

*N* in this case is independent of the *a* value.
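With the thresholds used in this study (*M*σ = 4.0, *M*λ = 6.0) and *b* close to 1, Eq. (7) gives roughly 10² − 1 = 99 small events per cycle on average; a one-line check:

```python
def events_per_cycle(b, m_sigma, m_lambda):
    """Average number of small (M >= m_sigma) events between successive
    target (M >= m_lambda) events, Eq. (7); independent of the a-value."""
    return 10 ** (b * (m_lambda - m_sigma)) - 1

print(events_per_cycle(1.0, 4.0, 6.0))  # -> 99.0
```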

The mathematical procedure of the nowcasting computation can be stated in terms of the potential for large earthquakes (target events) by calculating the cumulative distribution function (CDF) of the counts of small events with magnitudes in [*M*σ, *M*λ]. The probability density function (PDF) and CDF of the small-event counts within each large-event cycle are calculated using the statistical approach of Bevington and Robinson [39]. The current CDF value can be obtained from the present count of small events, *n*(*t*), where *t* is the time since the most recent large event. The earthquake potential score (EPS) at time *t* is defined as this value:

$$\text{EPS} = p[n \le n(t)] \tag{8}$$

The EPS measures how far the study area has progressed toward the next significant earthquake with magnitude larger than *M*λ. According to Eqs. (7) and (8), the EPS rises over time after the most recent significant earthquake, abruptly returns to zero when the next significant earthquake strikes, and then begins to rise again until the following event. Throughout this procedure there is just one straightforward way of interpreting the earthquake data, with no fitting of model parameters. Because the EPS definition does not depend on the absolute rate of earthquakes in the area under study, the approach can be applied both to a broad seismic zone and to a small area, such as the region around a city.
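Under Eq. (8), the EPS is simply the empirical CDF of the past per-cycle small-event counts evaluated at the current count. A minimal sketch, with purely hypothetical per-cycle counts invented for illustration:

```python
def eps(past_counts, current_count):
    """Earthquake Potential Score, Eq. (8): the empirical CDF of the
    per-cycle small-event counts, p[n <= n(t)], evaluated at the
    current count since the last large event. Returns a value in [0, 1]."""
    past = sorted(past_counts)
    return sum(1 for n in past if n <= current_count) / len(past)

# Hypothetical natural-time counts from ten previous large-event cycles
past = [12, 35, 48, 60, 75, 90, 110, 130, 160, 220]
print(eps(past, 95))  # -> 0.6  (well into the current cycle)
print(eps(past, 10))  # -> 0.0  (shortly after a large event)
```

In practice the discrete empirical CDF is often smoothed by fitting the PDF of cycle counts, in the spirit of Bevington and Robinson [39], before evaluating it at *n*(*t*).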
