**Current State of Art in Earthquake Prediction, Typical Precursors and Experience in Earthquake Forecasting at Sakhalin Island and Surrounding Areas**

I.N. Tikhonov1 and M.V. Rodkin1,2

*1Institute of Marine Geology and Geophysics FEB RAS, Yuzhno-Sakhalinsk; 2International Institute of Earthquake Prediction Theory and Mathematical Geophysics RAS, Moscow, Russia*

## **1. Introduction**

Despite over a century of scientific effort, our understanding of earthquake forecasting remains immature. Moreover, even the theoretical possibility of earthquake forecasting is still debated; especially problematic is the possibility of effective short- and intermediate-term forecasting. The aim of this paper is to present new evidence in support of the possibility of short- and intermediate-term earthquake forecasting. This possibility is demonstrated through a discussion of the seismic regime in the generalized vicinity of a strong earthquake and through a description of earthquake forecasting experience for Sakhalin Island and the surrounding areas.

The USGS/NEIC catalog and the Harvard seismic moment catalog are used to construct the generalized space–time vicinity of a strong (M7+) earthquake and to reveal robust, typical long-, intermediate-, and short-term precursory anomalies. The substantial increase in available information resulting from this procedure makes it possible to characterize the precursors of a strong earthquake in detail. The typical parameters of the foreshock and aftershock cascades were refined, and a few other revealed precursory anomalies indicate the development of softening in the source area of a strong earthquake. Taken together, these precursory anomalies point quite definitely to the approach of a strong event. One can therefore conclude that effective short- and intermediate-term earthquake forecasting appears to be possible when the volume of statistical information available for forecasting is substantially increased.

The current state of the art in earthquake forecasting is illustrated by the forecasting experience for Sakhalin Island and the surrounding areas gained at the Institute of Marine Geology and Geophysics of the Far East Branch of the Russian Academy of Sciences, Yuzhno-Sakhalinsk, Russia. Four examples of successful prognosis (three of them made in real time) and one false alarm took place. Thus, despite the evident deficiency in available information, the results of forecasting appear to be encouraging enough; in any case, they are much better than they could be if the seismic roulette model were valid.

In the early 1980s a few examples of successful earthquake prognosis were known, and a final solution of the earthquake prediction problem seemed to be close. However, the substantial increase in the number of sensors used in earthquake monitoring, and the corresponding increase in available information, did not improve the quality of prognosis. The situation was widely discussed in the 1990s, and the dominant opinion of the world scientific community was quite pessimistic. An earthquake-generating system was found to be very unstable: a minor change in the parameters of such a system can significantly change its evolution, so an effective prognosis of its behavior is impossible. Thus earthquake prognosis was declared to be impossible (Geller, 1997; Geller et al., 1997; Kagan, 1997; and references therein). Despite this dominant opinion, a few groups of researchers continued their investigations in earthquake forecasting. First of all, the effectiveness of previously suggested algorithms of strong earthquake prediction was tested in real time. The results of the M8 and Mendocino Scenario algorithms suggested in (Keilis-Borok & Kossobokov, 1986, 1990; Kossobokov, 1986) were examined over more than twenty years. It was shown that the results of prognosis were significantly better than could be expected from a seismic roulette procedure (Shebalin, 2006; Kossobokov, 2005). However, neither these algorithms nor others tested over shorter time intervals (Sobolev et al., 1999; Papazachos, 2005; Zavyalov, 2006; and others) showed results suitable for practical use: there remained substantial probabilities of missing an earthquake or declaring a false alarm.

The low efficiency of earthquake prediction is connected with the extremely irregular character of the seismic regime. Because of this high level of irregularity, the parameters of earthquake precursors are vague, and even the very existence of precursory phenomena remains debatable. In the absence of well-established precursors, any forecasting algorithm based on their use could hardly be very effective.

Thus verification of the precursory phenomena in use is an urgent problem. The precursory process and the occurrence of a large earthquake are commonly treated as an example of a critical phenomenon (Akimoto & Aizawa, 2006; Bowman et al., 1998; Keilis-Borok & Soloviev, 2003; Malamud et al., 2005; Nonlinear …, 2002; Sornette, 2000; etc.). Many of the precursors currently in use, such as the development of a foreshock cascade, an increase in correlation length, and abnormal clustering of earthquakes, are expected to occur in critical processes. Moreover, some of these precursors came into use precisely because the occurrence of a strong earthquake is treated in terms of the critical phenomenon model. In this situation a natural question arises: to what extent are such model processes really typical of scenarios of large earthquake occurrence? Romashkova and Kossobokov (2001) considered the evolution of foreshock and aftershock activity in the vicinities of eleven strong earthquakes occurring from 1985 to 2000. This examination did not support the universality of power-law growth in foreshock activity toward the moment of a large earthquake. It also turned out that the aftershock sequences in a number of cases differ significantly from the Omori law. As a result it was hypothesized (Romashkova & Kossobokov, 2001; Kossobokov, 2005) that scenarios of aftershock sequences deviating from the Omori law can exist.

It seems natural to ask whether the observed deviations of the seismic process from the theoretically expected universal scenario have a stochastic nature, or whether different scenarios can be realized in different foreshock and aftershock sequences. The answer to this question can be obtained by investigating the mean features inherent to the vicinities of a large number of strong earthquakes. A strong earthquake vicinity is understood here as a space–time domain where the evolution of seismicity is influenced by the occurrence of a given strong earthquake. Using the approach presented in (Rodkin, 2008), we have constructed the mean generalized space–time vicinity of a large number of strong earthquakes and examined the mean anomalies inherent to this vicinity.

## **2. Construction of generalized vicinity of strong earthquake**

We have used the Harvard worldwide seismic moment catalog for 1976–2005 and the USGS/NEIC catalog for 1968–2007. In both cases only shallow earthquakes with depth H < 70 km were examined. Two subsets of data can be used: the first includes all earthquakes in the catalog, and the second includes only the stronger earthquakes that are completely reported. Below we present the results of processing the Harvard catalog using the first subset (all reported events) and the results for the USGS/NEIC catalog using only completely reported events. In the latter case, events with magnitude M ≥ 4.7 were used, for a total of 97,615 events. A similar cutoff for the Harvard catalog would reduce the available data too much to obtain statistically robust results.
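The event selection just described reduces to two simple filters: a depth cutoff and a completeness (magnitude) cutoff. A minimal sketch, assuming a flat record layout of (time, lat, lon, depth, magnitude) tuples that is our own convention:

```python
def select_events(catalog, max_depth_km=70.0, min_magnitude=4.7):
    """Keep shallow (depth < 70 km), completely reported (M >= 4.7) events,
    as was done for the USGS/NEIC catalog. Record layout (an assumption):
    (time, lat, lon, depth_km, magnitude)."""
    return [ev for ev in catalog
            if ev[3] < max_depth_km and ev[4] >= min_magnitude]
```

For the first subset (all reported events, as used for the Harvard catalog), one would simply call `select_events(catalog, min_magnitude=0.0)`.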

Both data sets were searched for events falling into the space–time domains surrounding the source zones of large (M7+) earthquakes, with due account taken of the seismic moment in the Harvard catalog and of the maximum magnitude in the USGS/NEIC catalog. A generalized vicinity of a large earthquake is understood as the set of events falling into the zone of influence of any of these strong earthquakes. The zones of influence were defined as follows (see also (Rodkin, 2008) for details). The spatial dimensions of the zones of influence for earthquakes of different magnitudes were calculated from the approximate relationship (Sobolev & Ponomarev, 2003) between the typical source size *L* and earthquake magnitude *M*:

*L* (km) = 10^(0.5M – 1.9). (1)

In the examination below, earthquakes located at distances within 7×*L* from the epicenter of the given strong earthquake were taken into account.
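For orientation, relationship (1) and the resulting 7×*L* search radius are easy to evaluate numerically; a minimal sketch (the function names are ours, not from the original study):

```python
def source_size_km(magnitude):
    """Typical source size from Eq. (1): L = 10**(0.5*M - 1.9), in km."""
    return 10 ** (0.5 * magnitude - 1.9)

def influence_radius_km(magnitude, factor=7.0):
    """Spatial radius of the zone of influence used in this study: 7 * L."""
    return factor * source_size_km(magnitude)

# For an M = 7.0 event: L = 10**1.6, roughly 39.8 km, so the search
# radius 7*L is roughly 279 km.
```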

To construct the time vicinity of a strong earthquake we used the conclusion that the duration of a failure cycle depends only weakly on earthquake magnitude (Smirnov, 2003). Hence the simple epoch superposition method can be used to compare the time vicinities of earthquakes with close magnitudes. In the figures below, all earthquakes located in the 7×*L* area of the corresponding strong event were taken into account. This choice allows the most complete use of the available data. Its negative consequences are a lower statistical significance at the edges of the time interval, because of the shortage of data there, and a false effect of systematic growth in the number of earthquakes toward the centre of the time interval. These errors can, however, be taken into account, so they do not distort the results. The generalized vicinity of a large earthquake constructed in this way contained more than 60,000 earthquakes for the Harvard catalog and more than 300,000 earthquakes for the USGS/NEIC catalog. Such a large number of events results from the fact that one and the same earthquake can belong to the space–time vicinities of several strong earthquakes.
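The stacking procedure described above can be sketched as follows. This is a minimal illustration under stated assumptions: the catalog layout (time, lat, lon, magnitude), the helper names, and the flat-Earth distance approximation are ours, not the authors' implementation:

```python
import math

def source_size_km(m):
    """Typical source size from Eq. (1): L = 10**(0.5*M - 1.9), in km."""
    return 10 ** (0.5 * m - 1.9)

def epicentral_distance_km(lat1, lon1, lat2, lon2):
    """Flat-Earth approximation of epicentral distance; adequate for a sketch."""
    mean_lat = math.radians(0.5 * (lat1 + lat2))
    dy = 111.2 * (lat1 - lat2)
    dx = 111.2 * (lon1 - lon2) * math.cos(mean_lat)
    return math.hypot(dx, dy)

def generalized_vicinity(catalog, mainshocks, factor=7.0):
    """Epoch superposition: stack event times relative to each mainshock
    whose 7*L zone of influence contains them.

    One and the same event may fall into the zones of several mainshocks,
    which is why the stacked set can greatly exceed the catalog size."""
    stacked = []
    for t0, lat0, lon0, m0 in mainshocks:
        radius = factor * source_size_km(m0)
        for t, lat, lon, m in catalog:
            if epicentral_distance_km(lat, lon, lat0, lon0) <= radius:
                stacked.append(t - t0)  # time relative to the mainshock
    return stacked
```

Negative relative times in the stacked set correspond to foreshock activity and positive times to aftershock activity, so the mean fore- and aftershock cascades can be read directly from the distribution of these values.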


