Diagnosis of insidious data disasters
Author(s) -
Lundquist Jessica D.,
Wayand Nicholas E.,
Massmann Adam,
Clark Martyn P.,
Lott Fred,
Cristea Nicoleta C.
Publication year - 2015
Publication title -
Water Resources Research
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.863
H-Index - 217
eISSN - 1944-7973
pISSN - 0043-1397
DOI - 10.1002/2014WR016585
Subject(s) - snow , streamflow , energy balance , data collection , meteorology , water balance , environmental science , hydrology , climatology , computer science , statistics , geology , geography , drainage basin
Everyone taking field observations has a story of data collection gone wrong, and in most cases, the errors in the data are immediately obvious. A more challenging problem occurs when the errors are insidious, i.e., not readily detectable, and the error‐laden data appear useful for model testing and development. We present two case studies, one related to the water balance in the snow‐fed Tuolumne River, Sierra Nevada, California, combined with modeling using the Distributed Hydrology Soil Vegetation Model (DHSVM); and one related to the energy balance at Snoqualmie Pass, Washington, combined with modeling using the Structure for Unifying Multiple Modeling Alternatives (SUMMA). In the Tuolumne, modeled streamflow in 1 year was more than twice as large as observed; at Snoqualmie, modeled nighttime surface temperatures were biased by about +10°C. Both appeared to be modeling failures, until detective work uncovered observational errors. We conclude with a discussion of what these cases teach us about science in an age of specialized research, when one person collects data, a separate person conducts model simulations, and a computer is charged with data quality assurance.
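The abstract closes on automated data quality assurance failing to catch insidious errors. A minimal sketch of the kind of model-versus-observation bias screen that could surface such a problem is below; the function names, threshold, and sample values are illustrative assumptions, not from the paper (the +10°C nighttime bias at Snoqualmie is only echoed in the example data).

```python
# Hypothetical QA sketch: flag a persistent bias between modeled and observed
# series so a human checks BOTH the model and the observations.
# All names and thresholds here are illustrative, not from the paper.

def mean_bias(modeled, observed):
    """Mean of (modeled - observed) over paired samples."""
    if len(modeled) != len(observed) or not modeled:
        raise ValueError("need equal-length, non-empty series")
    return sum(m - o for m, o in zip(modeled, observed)) / len(modeled)

def flag_insidious_bias(modeled, observed, threshold):
    """True when |mean bias| exceeds threshold; a cue for detective work,
    not proof of which side (model or data) is wrong."""
    return abs(mean_bias(modeled, observed)) > threshold

# Example loosely echoing the Snoqualmie case: nighttime surface
# temperatures (degC), with modeled values ~10 degC warmer than observed.
observed = [-2.0, -1.5, -3.0, -2.5]
modeled = [8.1, 8.4, 7.2, 7.8]
print(flag_insidious_bias(modeled, observed, threshold=3.0))  # True
```

The point of such a screen is only to raise a flag; as the case studies show, the bias may originate in the observations rather than the model, so the flag should trigger inspection of both.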