Preconditioned least‐squares Petrov–Galerkin reduced order models
Author(s) - Payton Lindsay, Jeffrey Fike, Irina Tezaur, Kevin Carlberg
Publication year - 2022
Publication title - International Journal for Numerical Methods in Engineering
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.421
H-Index - 168
eISSN - 1097-0207
pISSN - 0029-5981
DOI - 10.1002/nme.7056
Subject(s) - preconditioner, Jacobian matrix and determinant, solver, residual, projection (linear algebra), mathematics, mathematical optimization, computer science, algorithm, iterative method
Abstract - In this article, we introduce a methodology for improving the accuracy and efficiency of reduced order models (ROMs) constructed using the least‐squares Petrov–Galerkin (LSPG) projection method through the introduction of preconditioning. Unlike prior related work, which focuses on preconditioning the linear systems arising within the ROM numerical solution procedure to improve linear solver performance, our approach leverages a preconditioning matrix directly within the minimization problem underlying the LSPG formulation. Applying preconditioning in this way has the potential to improve ROM accuracy for several reasons. First, preconditioning the LSPG formulation changes the norm defining the residual minimization, which can improve the residual‐based stability constant bounding the ROM solution's error. The incorporation of a preconditioner into the LSPG formulation can have the additional effect of scaling the components of the residual being minimized to make them roughly of the same magnitude, which can be beneficial when applying the LSPG method to problems with disparate scales (e.g., dimensional equations, multi‐physics problems). Importantly, we demonstrate that an “ideal preconditioned” LSPG ROM (a ROM in which the preconditioner is the inverse of the Jacobian of its corresponding full order model) emulates projection of the full order model solution increment onto the reduced basis. This projection defines a lower bound on the error of a ROM solution for a given reduced basis. By designing preconditioners that approximate the Jacobian inverse, as is common in designing preconditioners for solving linear systems, it is possible to obtain a ROM whose error approaches this lower bound. The proposed approach is evaluated on several mechanical and thermo‐mechanical problems implemented within the Albany HPC code and run in the predictive regime, with prediction across the material parameter space. We demonstrate numerically that the introduction of simple Jacobi, Gauss‐Seidel, and ILU preconditioners into the proper orthogonal decomposition/LSPG formulation significantly reduces the ROM solution error, the reduced Jacobian condition number, the number of nonlinear iterations required to reach convergence, and the wall time (thereby improving efficiency). Moreover, our numerical results reveal that the introduction of preconditioning can deliver a robust and accurate solution for test cases in which the unpreconditioned LSPG method fails to converge.
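As an illustrative sketch of the formulation summarized above (the notation below is assumed for this note only and is not defined in the abstract itself): writing r for the full order model (FOM) residual, J for the FOM Jacobian, Phi for the reduced basis (e.g., obtained via proper orthogonal decomposition), x-bar for a reference state, and M for the preconditioning matrix, the preconditioned LSPG solution is characterized by

  \hat{\mathbf{x}} = \arg\min_{\hat{\mathbf{y}}} \left\| \mathbf{M}\,\mathbf{r}\!\left(\bar{\mathbf{x}} + \boldsymbol{\Phi}\hat{\mathbf{y}}\right) \right\|_{2}^{2},

which is the usual LSPG minimization with the residual measured in the norm induced by M^T M rather than the Euclidean norm. For the "ideal" choice M = J^{-1}, linearizing the residual about the current iterate, r ≈ r_0 + J Phi Δŷ, reduces each step to

  \Delta\hat{\mathbf{y}} = \arg\min_{\hat{\mathbf{z}}} \left\| \mathbf{J}^{-1}\mathbf{r}_{0} + \boldsymbol{\Phi}\,\hat{\mathbf{z}} \right\|_{2}^{2} = -\boldsymbol{\Phi}^{\top}\mathbf{J}^{-1}\mathbf{r}_{0} \quad (\text{for orthonormal } \boldsymbol{\Phi}),

that is, the l2-orthogonal projection of the FOM Newton increment -J^{-1} r_0 onto the span of the reduced basis, consistent with the lower-bound behavior described in the abstract.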
