Demystifying Fisher Information: What Observation Data Reveal about Our Models
Author(s) - Judith Schenk, Eileen Poeter, William Navidi
Publication year - 2018
Publication title - Groundwater
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.84
H-Index - 94
eISSN - 1745-6584
pISSN - 0017-467X
DOI - 10.1111/gwat.12668
Subject(s) - computer science, information retrieval, data science
Information theory provides the basis for understanding how information is transmitted through observations. Observation data can be used to compare the uncertainty in parameter estimates and predictions between models. Jacobian Information (JI) is quantified as the determinant of the weighted Jacobian (sensitivity) matrix. Fisher Information (FI) is quantified as the determinant of the weighted FI matrix. FI measures the relative disorder (entropy) of a model within a set of models. One-dimensional models are used to demonstrate the relationship between JI and FI, and the resulting uncertainty in estimated parameter values and model predictions, for increasing model complexity, different model structures, different boundary conditions, and over-fitted models. Greater model complexity results in increased JI, accompanied by an increase in parameter and prediction uncertainty. FI generally increases with increasing model complexity unless model error is large. Models with lower FI have a higher level of disorder (higher entropy), which results in greater uncertainty in parameter estimates and model predictions. A constant-head boundary constrains the heads near the boundary, reducing the sensitivity of simulated equivalents to the estimated parameters. JI and FI are therefore lower for this boundary condition than for a constant-outflow boundary, in which heads near the boundary can adjust freely. Complex, over-fitted models, whose structure is not supported by the observation dataset, yield lower JI and FI because the data contain insufficient information to estimate all parameters in the model.
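As a minimal numerical sketch of the quantities described in the abstract (assuming a standard weighted least-squares setting; the names J, W, and fisher_matrix below are illustrative and are not taken from the paper), the FI matrix can be formed from a sensitivity (Jacobian) matrix and a diagonal observation-weight matrix, with its determinant serving as a scalar information measure:

```python
import numpy as np

# Illustrative sketch, not the authors' code: for a model with n observations
# and p parameters, J holds sensitivities d(simulated equivalent)/d(parameter),
# and W is a diagonal matrix of observation weights (e.g., 1/variance).
rng = np.random.default_rng(0)
n_obs, n_par = 10, 3
J = rng.normal(size=(n_obs, n_par))                    # Jacobian (sensitivity) matrix
W = np.diag(1.0 / rng.uniform(0.5, 2.0, n_obs) ** 2)   # observation weights

# Under a Gaussian-error, weighted least-squares assumption, the Fisher
# Information matrix is J^T W J; its determinant gives a scalar FI value.
fisher_matrix = J.T @ W @ J
FI = np.linalg.det(fisher_matrix)

# The inverse of the Fisher matrix approximates the parameter covariance, so a
# smaller determinant (less information) means greater parameter uncertainty.
param_cov = np.linalg.inv(fisher_matrix)
print(f"det(J^T W J) = {FI:.4g}")
print("parameter variances:", np.diag(param_cov))
```

This connection between the determinant of the Fisher matrix and the parameter covariance is what links lower FI to the greater parameter and prediction uncertainty discussed above.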