Data‐driven modeling from noisy measurements
Author(s) - Ion Victor Gosea, Qiang Zhang, Athanasios C. Antoulas
Publication year - 2021
Publication title - PAMM (Proceedings in Applied Mathematics and Mechanics)
Language(s) - English
Resource type - Journals
ISSN - 1617-7061
DOI - 10.1002/pamm.202000358
Subject(s) - overfitting, noise, sensitivity (control systems), interpolation, algorithm, machine learning, data mining, computer science
The scope of this contribution is to present some recent results on how interpolation-based data-driven methods such as the Loewner framework [1] and the AAA algorithm [2] can handle noisy data sets. More precisely, it is assumed that the input-output measurements used in these methods, i.e., transfer function evaluations, are corrupted by additive Gaussian noise. The notion of "sensitivity to noise" is introduced and used to understand how the location of measurement points affects the quality of reduced-order models. For example, models whose poles have high sensitivity are deemed unreliable, since even small perturbations could cause unwanted behavior (such as instability). Moreover, we show how different data splitting techniques influence the sensitivity values; this splitting is a crucial step in the Loewner framework, and we present illustrative examples of the effects of splitting the data in the "wrong" and in the "right" way. Finally, we outline some perspectives for future work: we would like to employ statistics and machine learning techniques in order to avoid overfitting. A model that has learned the noise instead of the true signal is said to overfit: it fits the given noisy dataset well but generalizes poorly to new datasets. We present some possible ways to avoid overfitting for the methods under consideration.
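To make the setting concrete, below is a minimal Python/NumPy sketch of the Loewner framework applied to noisy transfer function evaluations. The toy system, noise level, alternating data splitting rule, and truncation tolerance are all illustrative assumptions and are not taken from the paper:

```python
import numpy as np

# Toy transfer function (an illustrative assumption, not the paper's benchmark)
def H(s):
    return 1.0 / (s**2 + 0.2 * s + 1.0)

rng = np.random.default_rng(0)

# Transfer function evaluations on the imaginary axis, corrupted by additive Gaussian noise
pts = 1j * np.logspace(-1, 1, 40)
vals = H(pts) + 0.01 * (rng.standard_normal(pts.size) + 1j * rng.standard_normal(pts.size))

# Data splitting: alternate the samples into "left" (mu, v) and "right" (lam, w) sets
mu, v = pts[0::2], vals[0::2]
lam, w = pts[1::2], vals[1::2]

# Loewner matrix L and shifted Loewner matrix Ls
L  = (v[:, None] - w[None, :]) / (mu[:, None] - lam[None, :])
Ls = (mu[:, None] * v[:, None] - lam[None, :] * w[None, :]) / (mu[:, None] - lam[None, :])

# Rank-revealing SVDs; the reduced order r follows the singular-value decay
Y, S, _ = np.linalg.svd(np.hstack([L, Ls]))
_, _, Xh = np.linalg.svd(np.vstack([L, Ls]))
r = int(np.sum(S / S[0] > 1e-3))
Yr, Xr = Y[:, :r], Xh.conj().T[:, :r]

# Projected descriptor realization: E = -Y*L X, A = -Y*Ls X, B = Y*V, C = W X
E = -Yr.conj().T @ L @ Xr
A = -Yr.conj().T @ Ls @ Xr
B = Yr.conj().T @ v[:, None]
C = w[None, :] @ Xr

def H_red(s):
    """Reduced-order transfer function C (sE - A)^{-1} B."""
    return (C @ np.linalg.solve(s * E - A, B))[0, 0]

# Poles of the reduced model: eigenvalues of the pencil (A, E),
# assuming the projected E is invertible
poles = np.linalg.eigvals(np.linalg.solve(E, A))
print("order:", r, "poles:", poles)
```

With noise-free data this construction interpolates the samples exactly; with noise, the identified poles move away from the true ones, which is what motivates studying their sensitivity.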
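The sensitivity of the identified poles to noise can also be probed empirically, by redrawing the additive Gaussian noise many times and measuring the spread of each pole across the identified models. This Monte Carlo experiment is only a rough proxy for the sensitivity notion introduced in the paper; the helper name, noise level, and model order below are illustrative:

```python
import numpy as np

def loewner_poles(pts, vals, r):
    """Poles of an order-r Loewner model built from data (pts, vals).
    Uses alternating data splitting; assumes the projected E matrix is invertible."""
    mu, v = pts[0::2], vals[0::2]
    lam, w = pts[1::2], vals[1::2]
    L  = (v[:, None] - w[None, :]) / (mu[:, None] - lam[None, :])
    Ls = (mu[:, None] * v[:, None] - lam[None, :] * w[None, :]) / (mu[:, None] - lam[None, :])
    Y, _, _ = np.linalg.svd(np.hstack([L, Ls]))
    _, _, Xh = np.linalg.svd(np.vstack([L, Ls]))
    Yr, Xr = Y[:, :r], Xh.conj().T[:, :r]
    E = -Yr.conj().T @ L @ Xr
    A = -Yr.conj().T @ Ls @ Xr
    return np.linalg.eigvals(np.linalg.solve(E, A))

rng = np.random.default_rng(1)
H = lambda s: 1.0 / (s**2 + 0.2 * s + 1.0)   # same toy system as above
pts = 1j * np.logspace(-1, 1, 40)

# Monte Carlo: redraw the additive Gaussian noise and record the pole
# locations of each identified model
runs = []
for _ in range(200):
    noise = 0.01 * (rng.standard_normal(pts.size) + 1j * rng.standard_normal(pts.size))
    runs.append(np.sort_complex(loewner_poles(pts, H(pts) + noise, r=2)))
runs = np.array(runs)

# A large spread, or poles crossing into the right half-plane, flags
# high sensitivity; such models would be deemed unreliable
print("pole std per pole:", runs.std(axis=0))
print("fraction unstable:", np.mean(runs.real.max(axis=1) > 0))
```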