Statistical procedures for developing earthquake damage fragility curves
Author(s) - David Lallemant, Anne Kiremidjian, Henry Burton
Publication year - 2015
Publication title - Earthquake Engineering & Structural Dynamics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.218
H-Index - 127
eISSN - 1096-9845
pISSN - 0098-8847
DOI - 10.1002/eqe.2522
Subject(s) - fragility, parametric statistics, smoothing, curve fitting, kernel smoother, parametric model, kernel method, econometrics, statistics, machine learning, data mining, mathematics, computer science
Summary This paper describes statistical procedures for developing earthquake damage fragility functions. Although fragility curves abound in the earthquake engineering and risk assessment literature, the focus has generally been on the methods for obtaining the damage data (i.e., the analysis of structures), with little emphasis placed on the process of fitting fragility curves to these data. This paper provides a synthesis of the most commonly used methods for fitting fragility curves and highlights some of their significant limitations. More novel methods are then described for developing parametric fragility curves (generalized linear models and cumulative link models) and non-parametric curves (generalized additive models and Gaussian kernel smoothing). An extensive discussion of the advantages and disadvantages of each method is provided, along with examples using both empirical and analytical data. The paper further proposes methods for treating uncertainty in the intensity measure, an issue common with empirical data. Finally, the paper describes approaches for choosing among candidate fragility models, based on an evaluation of prediction error under a user-defined loss function. Copyright © 2015 John Wiley & Sons, Ltd.
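To make the parametric and non-parametric procedures named in the summary concrete, the following is a minimal Python sketch, not the authors' implementation: it fits a lognormal fragility function by probit regression (one of the generalized linear models the paper discusses) to synthetic damage-exceedance data, and adds a Gaussian-kernel (Nadaraya-Watson) smoother as a non-parametric counterpart. The data, parameter values, bandwidth, and the helper name `kernel_fragility` are all hypothetical.

```python
import numpy as np
from scipy.stats import norm
import statsmodels.api as sm

rng = np.random.default_rng(42)

# Hypothetical damage data: 200 ground-motion intensities (e.g., PGA in g)
# with binary damage-exceedance outcomes drawn from an assumed "true"
# lognormal fragility (median 0.40 g, dispersion 0.6).
im = rng.lognormal(mean=np.log(0.3), sigma=0.5, size=200)
y = rng.binomial(1, norm.cdf(np.log(im / 0.40) / 0.6))

# Parametric fit -- probit GLM: P(damage | IM) = Phi(b0 + b1 * ln IM).
X = sm.add_constant(np.log(im))
fit = sm.GLM(y, X,
             family=sm.families.Binomial(link=sm.families.links.Probit())).fit()
b0, b1 = fit.params

# Matching Phi(b0 + b1 ln IM) = Phi((ln IM - ln theta) / beta) recovers the
# standard lognormal fragility parameters from the regression coefficients.
theta_hat = np.exp(-b0 / b1)   # median intensity
beta_hat = 1.0 / b1            # dispersion
print(f"theta = {theta_hat:.3f} g, beta = {beta_hat:.3f}")

# Non-parametric counterpart -- Gaussian kernel smoothing (Nadaraya-Watson)
# of the exceedance probability over ln(IM).
def kernel_fragility(im_grid, im_obs, y_obs, bandwidth=0.2):
    """Kernel-regression estimate of P(damage | IM) on a grid of IM values."""
    u = (np.log(im_grid)[:, None] - np.log(im_obs)[None, :]) / bandwidth
    w = norm.pdf(u)                      # Gaussian kernel weights
    return (w @ y_obs) / w.sum(axis=1)   # locally weighted mean of outcomes

grid = np.linspace(0.05, 1.5, 50)
p_smooth = kernel_fragility(grid, im, y)
```

In the spirit of the model-selection approach the paper describes, either estimate could then be scored by out-of-sample prediction error under a chosen loss function, for example via cross-validation.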