Multivariate predictions of local reduced‐order‐model errors and dimensions
Author(s) - Moosavi Azam, Ştefănescu Răzvan, Sandu Adrian
Publication year - 2017
Publication title - International Journal for Numerical Methods in Engineering
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.421
H-Index - 168
eISSN - 1097-0207
pISSN - 0029-5981
DOI - 10.1002/nme.5624
Subject(s) - multivariate statistics , curse of dimensionality , parametric statistics , mathematics , computer science , dimensionality reduction , algorithm , dimension (graph theory) , univariate , gaussian , parametric model , mathematical optimization , artificial intelligence , machine learning , statistics , physics , quantum mechanics , pure mathematics
Summary This paper introduces multivariate input-output models that predict the errors and basis dimensions of local parametric Proper Orthogonal Decomposition (POD) reduced-order models. We refer to these mappings as the multivariate predictions of local reduced-order model characteristics (MP-LROM) models, and we construct approximations of them using Gaussian processes and artificial neural networks. Numerical results with a viscous Burgers model illustrate the performance and potential of the machine-learning-based regression MP-LROM models for approximating the characteristics of parametric local reduced-order models. The predicted reduced-order model errors are compared against the predictions of the multifidelity correction and reduced-order model error surrogate methods, whereas the predicted reduced-order dimensions are tested against the standard method based on the spectrum of the snapshots matrix. Because the MP-LROM models incorporate more features and elements when constructing the probabilistic mappings, they achieve more accurate results. For high-dimensional parametric spaces, however, the MP-LROM models may suffer from the curse of dimensionality. The scalability challenges of MP-LROM models, and feasible ways of addressing them, are also discussed in this study.
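The dimension-selection baseline mentioned in the summary — choosing the reduced-order basis size from the spectrum of the snapshots matrix — can be sketched as follows. This is a minimal illustration of the standard singular-value energy criterion, not code from the paper; the function name, the tolerance value, and the toy snapshot matrix are all illustrative assumptions.

```python
import numpy as np

def pod_dimension(snapshots: np.ndarray, energy_tol: float = 0.99) -> int:
    """Smallest POD basis size retaining `energy_tol` of the snapshot energy.

    `snapshots` holds one solution state per column; the basis dimension is
    the number of leading singular values whose cumulative squared energy
    first reaches the tolerance. (Illustrative sketch, not the paper's code.)
    """
    s = np.linalg.svd(snapshots, compute_uv=False)  # singular values, descending
    energy = np.cumsum(s**2) / np.sum(s**2)         # cumulative relative energy
    return int(np.searchsorted(energy, energy_tol) + 1)

# Toy snapshot matrix with a known spectrum: singular values 10, 5, 2, 1e-6.
# The first three modes carry essentially all of the energy, but the first
# two alone fall short of the 99% threshold, so the criterion selects 3.
snapshots = np.diag([10.0, 5.0, 2.0, 1e-6])
print(pod_dimension(snapshots))  # → 3
```

MP-LROM replaces this purely spectral criterion with learned regression mappings (Gaussian processes, neural networks) that also take the model parameters into account when predicting the local basis dimension.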