Open Access
The weight of the flood‐of‐record in flood frequency analysis
Author(s) - Scott St. George, Manfred Mudelsee
Publication year - 2019
Publication title - Journal of Flood Risk Management
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.049
H-Index - 36
ISSN - 1753-318X
DOI - 10.1111/jfr3.12512
Subject(s) - flood myth, 100-year flood, environmental science, hydrology (agriculture), distribution (mathematics), statistics, frequency distribution, physical geography, geography, mathematics, geology, geotechnical engineering, archaeology, mathematical analysis
The standard approach to flood frequency analysis (FFA) fits mathematical functions to sequences of historical flood data and extrapolates the tails of the distribution to estimate the magnitude and likelihood of extreme floods. Here, we identify the most exceptional floods in the United States as compared against other major floods at the same location, and evaluate how the flood-of-record (Q_max) influences FFA estimates. On average, floods-of-record are 20% larger by discharge than their second-place counterparts (Q_2), and 212 gages (7.3%) have Q_max:Q_2 ratios greater than two. There is no clear correspondence between the Q_max:Q_2 ratio and median instantaneous discharge, and exceptional floods do not become less likely with time. Excluding Q_max from the FFA causes the median 100-year flood estimate to decline by 10.5%, the 200-year flood by 11.8%, and the 500-year flood by 13.4%. Even when floods are modelled using a heavy-tailed distribution, the removal of Q_max yields significantly "lighter" tails and underestimates the risk of large floods. Despite the temporal extension of systematic hydrological observations in the United States, FFA remains sensitive to the presence of extreme events within the sample used to calculate the frequency curve.
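The sensitivity described in the abstract can be illustrated with a minimal sketch: fit a distribution to a series of annual peak discharges, estimate the 100-year flood, then refit with the flood-of-record removed and compare. This is an assumption-laden toy example, not the authors' procedure — the synthetic data, the choice of a GEV distribution via `scipy.stats.genextreme`, and the maximum-likelihood fit are all illustrative stand-ins for whatever FFA method a given agency applies.

```python
# Toy illustration (NOT the paper's method): how removing Q_max from an
# annual-peak series can lower the estimated 100-year flood.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Hypothetical annual peak discharges (m^3/s); in scipy's convention a
# negative shape parameter c gives a heavy-tailed (Frechet-type) GEV.
peaks = genextreme.rvs(c=-0.2, loc=500, scale=150, size=80, random_state=rng)

def return_level(sample, return_period):
    """Fit a GEV by maximum likelihood and estimate the T-year flood."""
    c, loc, scale = genextreme.fit(sample)
    # The T-year flood is the discharge exceeded with probability 1/T per year.
    return genextreme.isf(1.0 / return_period, c, loc, scale)

q_sorted = np.sort(peaks)[::-1]
ratio = q_sorted[0] / q_sorted[1]  # Q_max : Q_2 ratio for this record

q100_full = return_level(peaks, 100)                              # with Q_max
q100_trim = return_level(np.delete(peaks, peaks.argmax()), 100)   # without
change = 100.0 * (q100_trim - q100_full) / q100_full

print(f"Q_max:Q_2 ratio:              {ratio:.2f}")
print(f"100-year flood with Q_max:    {q100_full:.0f} m^3/s")
print(f"100-year flood without Q_max: {q100_trim:.0f} m^3/s")
print(f"Change from excluding Q_max:  {change:+.1f}%")
```

Running the comparison for many gages (rather than one synthetic record) is what yields the paper's summary statistics, such as the median decline of the 100-year flood when Q_max is excluded.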
