Open Access
Minimax entropy solutions of ill-posed problems
Author(s) - Fred Greensite
Publication year - 2009
Publication title - Quarterly of Applied Mathematics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.603
H-Index - 41
eISSN - 1552-4485
pISSN - 0033-569X
DOI - 10.1090/s0033-569x-09-01120-7
Subject(s) - minimax, discretization, mathematics, entropy, well-posed problem, multivariate statistics, operator, regularization, mathematical optimization, mathematical analysis, statistics
Convergent methodology for ill-posed problems is typically equivalent to application of an operator dependent on a single parameter derived from the noise level and the data (a regularization parameter or terminal iteration number). In the context of a given problem discretized for purposes of numerical analysis, these methods can be viewed as resulting from imposed prior constraints bearing the same amount of information content. We identify a new convergent method for the treatment of certain multivariate ill-posed problems, which imposes constraints of a much lower information content (i.e., having much lower bias), based on the operator’s dependence on many data-derived parameters. The associated marked performance improvements that are possible are illustrated with solution estimates for a Lyapunov equation structured by an ill-conditioned matrix. The methodology can be understood in terms of a Minimax Entropy Principle, which emerges from the Maximum Entropy Principle in some multivariate settings.
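The abstract contrasts the paper's many-parameter method with classical single-parameter regularization applied to a Lyapunov equation structured by an ill-conditioned matrix. The minimax entropy method itself is not reproduced here; the following is a minimal sketch of the classical baseline the paper improves on, assuming a Tikhonov-regularized solve of the vectorized (Kronecker) form of a Lyapunov equation with a synthetic ill-conditioned matrix and simulated noise. The matrix sizes, noise level, and parameter value are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8

# Synthetic ill-conditioned symmetric A with geometrically decaying spectrum.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.logspace(0, -10, n)
A = U @ np.diag(s) @ U.T

# Ground-truth symmetric solution and exact right-hand side of A X + X A^T = Q.
X_true = rng.standard_normal((n, n))
X_true = X_true + X_true.T
Q = A @ X_true + X_true @ A.T

# Vectorize: A X + X A^T = Q  <=>  (I kron A + A kron I) vec(X) = vec(Q),
# using the column-stacking convention (hence order="F").
I = np.eye(n)
K = np.kron(I, A) + np.kron(A, I)
q_noisy = Q.reshape(-1, order="F") + 1e-8 * rng.standard_normal(n * n)

def rel_err(x):
    X = x.reshape(n, n, order="F")
    return np.linalg.norm(X - X_true) / np.linalg.norm(X_true)

# Naive solve amplifies the noise through the small eigenvalues of K.
x_naive = np.linalg.solve(K, q_noisy)

# Tikhonov regularization: a single parameter lam, chosen from the noise
# level and data, trades bias against noise amplification.
lam = 1e-6
x_tik = np.linalg.solve(K.T @ K + lam * np.eye(n * n), K.T @ q_noisy)

print(f"naive relative error:    {rel_err(x_naive):.2e}")
print(f"Tikhonov relative error: {rel_err(x_tik):.2e}")
```

The single parameter lam damps every small-eigenvalue component of K uniformly; the bias this imposes is the "prior constraint" the abstract refers to, and the paper's point is that an operator depending on many data-derived parameters can impose constraints of much lower information content.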
