Open Access
To the brave scientists: Aren't we strong enough to stand (and profit from) uncertainty in Earth system measurement and modelling?
Author(s) -
Paasche Hendrik,
Gross Matthias,
Lüttgau Jakob,
Greenberg David S.,
Weigel Tobias
Publication year - 2022
Publication title - Geoscience Data Journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.125
H-Index - 11
ISSN - 2049-6060
DOI - 10.1002/gdj3.132
Subject(s) - uncertainty quantification, ignorance, computer science, earth system science, uncertainty reduction theory, uncertainty analysis, big data, incentive, data science, computational model, risk analysis (engineering), management science, operations research, artificial intelligence, machine learning, data mining, economics, epistemology, engineering, simulation, medicine, ecology, philosophy, communication, sociology, biology, microeconomics
Abstract - The current handling of data in Earth observation, modelling and prediction gives cause for critical consideration, since we all too often carelessly ignore data uncertainty. We think that Earth scientists are generally aware of the importance of linking data to quantitative uncertainty measures. But we also think that uncertainty quantification of Earth observation data too often fails at very early stages. We claim that data acquisition without uncertainty quantification is not sustainable, and that machine learning and computational modelling cannot unfold their potential when analysing complex natural systems like the Earth. Current approaches such as stochastic perturbation of parameters or initial conditions cannot quantify uncertainty or bias arising from the choice of model, limiting scientific progress. We need incentives stimulating the honest treatment of uncertainty, starting during data acquisition and continuing through analysis methodology and prediction results. Computational modellers and machine learning experts have a critical role, since they enjoy high esteem from stakeholders and since their methodologies and results depend critically on data uncertainty. If both want to advance their uncertainty assessment of models and predictions of complex systems like the Earth, they have a common problem to solve. Together, computational modellers and machine learners could develop new strategies for bias identification and uncertainty quantification, offering a more all-embracing uncertainty quantification than any known methodology. But since it starts for computational modellers and machine learners with data and their uncertainty, the fundamental first step in such a development would be leveraging stakeholder esteem to insistently advocate for a reduction of ignorance when it comes to uncertainty quantification of data.
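The limitation the abstract points to can be made concrete with a minimal sketch, not taken from the paper: a toy "true" process with saturation is forecast by a structurally wrong exponential model, and the initial condition is stochastically perturbed to build an ensemble. All function names, parameter values and the seed below are hypothetical choices for illustration only. The ensemble spread then reflects only initial-condition uncertainty; the bias from the wrong model choice lies far outside that spread, which is exactly the gap stochastic perturbation cannot close.

```python
import random
import statistics

# Hypothetical "true" process: discrete logistic growth with saturation at k.
def truth(x0, steps=20, r=0.3, k=10.0):
    x = x0
    for _ in range(steps):
        x += r * x * (1 - x / k)
    return x

# Structurally wrong forecast model: pure exponential growth (no saturation).
def model(x0, steps=20, r=0.3):
    x = x0
    for _ in range(steps):
        x += r * x
    return x

random.seed(0)
x0_obs = 1.0   # observed initial condition (assumed value)
sigma = 0.05   # assumed observational uncertainty of the initial condition

# Stochastic perturbation of the initial condition: an ensemble of model runs.
ensemble = [model(random.gauss(x0_obs, sigma)) for _ in range(1000)]
mean = statistics.mean(ensemble)
spread = statistics.stdev(ensemble)

true_value = truth(x0_obs)

# The spread quantifies initial-condition uncertainty only; the structural
# (model-choice) bias is much larger and is not covered by the ensemble.
print(f"ensemble mean +/- spread: {mean:.2f} +/- {spread:.2f}")
print(f"true value:               {true_value:.2f}")
print(f"structural bias:          {mean - true_value:.2f}")
```

Running the sketch shows an ensemble whose spread is an order of magnitude smaller than its bias relative to the saturating truth, illustrating why perturbation-based uncertainty quantification alone can be badly overconfident when the model form itself is wrong.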
