Extension of classical attenuation laws: effects and implications on the probabilistic seismic hazard analysis

Università degli Studi di Napoli “Federico II”,  Dottorato di Ricerca in Rischio Sismico XVI ciclo, 2004
Vincenzo Convertito

Abstract

Earthquakes have always represented a threat to humankind, especially in tectonically active regions where their strength and frequency of occurrence particularly affect urban centers, causing loss of human life and damage to structures. The need to reduce and mitigate the effects of earthquakes has led scientists, in particular those involved in earth sciences and engineering, to develop increasingly refined techniques for seismic risk analysis. The primary goal of a seismic risk analysis is the estimation, for a given time period, of probable losses, expressed in terms of human life or damage to structures, due to the occurrence of one or more earthquakes that can affect the site of interest. Thus, by its nature, the accuracy of the analysis strongly depends on the degree of knowledge about the earthquake process, concerning the generation of seismic waves at the source, their propagation, and their interaction with structures. There are essentially two approaches to seismic risk analysis: the deterministic approach and the probabilistic approach. In the present thesis the probabilistic approach is considered. Whatever the adopted approach, a critical point in seismic risk analysis concerns the capability of predicting the expected ground motion at a site due to the occurrence of a given earthquake at a given distance from the site of interest. In fact, the formulation of design criteria for new structures, or retrofit criteria for existing ones, depends on the estimates of the expected ground motions from earthquakes during the expected lifetime of the structures.
Several techniques exist to simulate the ground motion at a site, ranging from theoretical to empirical or stochastic approaches. The selection of the most appropriate technique depends on the approach adopted for the hazard analysis and, essentially, on the available database. In particular, when the probabilistic approach is adopted and a large and complete database exists, the most widely used ground motion estimation technique is based on empirical laws. These empirical laws, called attenuation laws, are essentially equations that estimate the ground motion at a site as a function of the parameters influencing its value. The most commonly used parameters are the magnitude, the source-to-site distance, and coefficients accounting for site effects. Due to the limited extent of the presently available databases, attenuation laws cannot account for detailed aspects of the earthquake and propagation processes. However, in recent studies some efforts have been made to introduce into the attenuation laws parameters accounting for the style of faulting (Abrahamson and Silva, 1987), the fault geometry and the directivity effect (Somerville et al., 1997; Spudich et al., 1997). These approaches are generally based on the introduction of simple discrete coefficients into the attenuation law formulation and, although the results have been encouraging, some open questions remain. The first regards the fact that the reliability of the proposed approaches relies on the database and on the adopted regression technique. The second regards the reliability of the ground motion estimates when the attenuation law is used in tectonic regimes different from those in which the data were collected.
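For orientation, attenuation laws of the kind described above are commonly written in a schematic form such as

\ln Y = c_1 + c_2 M + c_3 \ln R + c_4 R + c_5 S + \varepsilon ,

where Y is the ground motion parameter of interest (e.g., peak ground acceleration), M the magnitude, R the source-to-site distance, S a coefficient (or set of indicator variables) accounting for site conditions, the c_i regression coefficients, and \varepsilon a zero-mean random term with standard deviation \sigma. This functional form and its coefficients are illustrative only and are not those of any specific law discussed in the thesis.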
In the present thesis we propose an alternative method to introduce physical aspects of the earthquake process into the attenuation laws. In particular, the focal mechanism and the related radiation pattern are analyzed. The proposed approach uses theoretical multiplicative coefficients to correct the ground motion estimates obtained with classical attenuation laws. These coefficients are aimed at overcoming some of the limitations mentioned above by introducing the focal mechanism as a priori information in the probabilistic seismic hazard analysis. From a general point of view, the proposed methodology has two aims: the first is to verify the effect and the influence of the focal mechanism on the results of the hazard analysis; the second concerns the extension of the concept of design earthquake, generally limited to magnitude and distance, to this source parameter. The latter analysis is performed using the de-aggregation technique, which is explained in Chapter 2.
The analysis followed four major lines. (1) In order to modify the classical analysis and test the correctness and the effects of our hypotheses, we implemented our own code for the computation of the classical seismic hazard analysis. (2) A first corrective coefficient was formulated to account for only two of the three fault mechanism parameters, i.e., the dip and rake angles. To test the assumptions used in its formulation, we performed a data comparison analysis, comparing the estimates obtained by three different attenuation laws for earthquakes with different focal mechanisms. The selected attenuation laws differ in that one does not contain any coefficient accounting for the faulting style, and can therefore be modified by the corrective coefficient, while the others do contain this parameter. (3) We compared the results of the hazard analysis, expressed as exceedance probability curves for a given set of seismic source zones and a set of sites, when the a priori information is used and when it is not. (4) We used the de-aggregation technique to obtain the design earthquake and to extend it to the focal mechanism. Also in this case we compared the results obtained in the two cases and, moreover, verified how the introduction of the focal mechanism affects the classical definition of the design earthquake.
The same analysis was then performed extending the corrective coefficient formulation to the third focal mechanism parameter, i.e., the strike angle. This also allows the relative source-to-site orientation to be taken into account. As shown in Chapter 4, the introduction of the strike angle, compared with the case in which only the dip and rake are considered, produces as a further effect the loss of the regularity that generally characterizes hazard maps. This regularity is a direct consequence of the assumption of a uniform seismogenic potential within the selected seismic source zones. In order to be compatible with the probabilistic approach to seismic hazard analysis, along with the formulation of the corrective coefficient, a further probability density function has been introduced into the hazard integral. From the computational point of view of seismic hazard, this pdf allows the a priori information concerning the most probable expected focal mechanism to be formalized and quantified.
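To fix ideas, the extension just described can be sketched starting from the classical hazard integral,

\lambda(Y > y) = \sum_{i} \nu_i \int\!\!\int P[\,Y > y \mid m, r\,]\, f_M(m)\, f_R(r)\, \mathrm{d}m\, \mathrm{d}r ,

and adding the focal mechanism \theta = (\mathrm{strike}, \mathrm{dip}, \mathrm{rake}) as a further random variable,

\lambda(Y > y) = \sum_{i} \nu_i \int\!\!\int\!\!\int P[\,Y > y \mid m, r, \theta\,]\, f_M(m)\, f_R(r)\, f_\Theta(\theta)\, \mathrm{d}m\, \mathrm{d}r\, \mathrm{d}\theta ,

where \nu_i is the activity rate of the i-th source zone, f_\Theta is the a priori probability density function on the focal mechanism, and the exceedance probability is evaluated with the median of the classical attenuation law multiplied by the corrective coefficient C(\theta). The symbols \nu_i, f_\Theta and C(\theta) are a schematic rendering of the description above rather than the thesis' own formalism.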
From the de-aggregation point of view, on the other hand, this same probability density function allows the most probable focal mechanism to be retrieved and thus the concept of design earthquake to be extended.
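In the same schematic notation, de-aggregation extended to the focal mechanism amounts to evaluating the conditional density of the hazard contributors given the exceedance of the target ground motion level,

f(m, r, \theta \mid Y > y) \propto P[\,Y > y \mid m, r, \theta\,]\, f_M(m)\, f_R(r)\, f_\Theta(\theta) ,

so that the design earthquake can be identified with, for example, the modal values of magnitude, distance and focal mechanism of this distribution. Again, this expression is a sketch consistent with the description above, not a formula taken from the thesis.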

 
