Physics‐constrained non‐Gaussian probabilistic learning on manifolds
Author(s) - Christian Soize, Roger Ghanem
Publication year - 2019
Publication title - International Journal for Numerical Methods in Engineering
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.421
H-Index - 168
eISSN - 1097-0207
pISSN - 0029-5981
DOI - 10.1002/nme.6202
Subject(s) - lagrange multiplier , probabilistic logic , markov chain , mathematics , mathematical optimization , gaussian , algorithm , statistics , physics , quantum mechanics
Summary An extension of the probabilistic learning on manifolds (PLoM), recently introduced by the authors, is presented: in addition to the initial data set used for the probabilistic learning, constraints are specified that correspond to statistics from experiments or from physical models. We consider a non‐Gaussian random vector whose unknown probability distribution must satisfy these constraints. The method consists of constructing a generator using the PLoM together with the classical Kullback‐Leibler minimum cross‐entropy principle. The resulting optimization problem is reformulated using Lagrange multipliers associated with the constraints. The optimal values of the Lagrange multipliers are computed with an efficient iterative algorithm. At each iteration, the Markov chain Monte Carlo algorithm developed for the PLoM is used, which solves an Itô stochastic differential equation projected on a diffusion‐maps basis. The method and the algorithm are efficient and allow the construction of probabilistic models for high‐dimensional problems from small initial data sets and for which an arbitrary number of constraints are specified. The first application is simple enough to be easily reproduced. The second concerns a stochastic elliptic boundary value problem in high dimension.
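The core of the iterative algorithm described above can be illustrated with a minimal sketch, under simplifying assumptions: instead of the PLoM generator with an Itô SDE projected on a diffusion-maps basis, a fixed small sample is reweighted by exponential tilting, and the Lagrange multipliers of the minimum cross-entropy problem are found by a dual fixed-point iteration that drives the constrained statistics toward their targets. All variable names (`X`, `b`, `lam`) and the choice of first-order moment constraints are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small initial data set (stand-in for the PLoM training sample).
X = rng.normal(size=(500, 2))

# Moment constraints: impose a target mean b on g(x) = x (illustrative).
b = np.array([0.5, -0.3])

def g(x):
    # Constraint function; here first-order moments for simplicity.
    return x

# Lagrange multipliers of the minimum cross-entropy (MaxEnt-type) problem.
lam = np.zeros(2)
for _ in range(200):
    # Tilted density p_lam ∝ p(x) exp(<lam, g(x)>); estimate E_lam[g]
    # by importance reweighting of the available sample.
    w = np.exp(g(X) @ lam)
    w /= w.sum()
    Eg = w @ g(X)
    # Dual gradient ascent: adjust lam until the constraint E_lam[g] = b holds.
    lam += 0.5 * (b - Eg)

# The weighted sample now satisfies the moment constraints approximately.
print("constrained mean:", w @ g(X))
```

In the paper, the expectation under the tilted density is instead estimated at each iteration by generating fresh realizations with the PLoM MCMC sampler, which scales to high dimension where plain reweighting of a small sample would degenerate.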
